# Framework Integrations
Aerostack provides native SDK integrations for the most popular AI frameworks. One workspace token gives any framework access to every MCP server, skill, and function you’ve connected.
## Available Integrations
| Framework | Package | Install |
|---|---|---|
| Vercel AI SDK | `@aerostack/sdk-vercel-ai` | `npm install @aerostack/sdk-vercel-ai` |
| LangChain | `@aerostack/sdk-langchain` | `npm install @aerostack/sdk-langchain` |
| OpenAI SDK | `@aerostack/sdk-openai` | `npm install @aerostack/sdk-openai` |
## How It Works
All three SDKs follow the same pattern:
1. **Connect** to your Aerostack workspace with a token (`mwt_...`)
2. **Fetch tools** — the SDK discovers all MCP tools in your workspace
3. **Convert** — tools are automatically converted to the framework’s native format
4. **Execute** — when the LLM calls a tool, the SDK proxies the call through the workspace gateway to the actual MCP server
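The discover-convert-proxy pattern can be sketched in TypeScript. Everything below is illustrative: the type names, the mocked discovery, and the in-memory gateway are stand-ins for what the real SDKs do, not their actual API.

```typescript
// Illustrative types -- stand-ins for the real SDK's shapes, not its API.
type McpTool = { name: string; description: string };
type NativeTool = McpTool & {
  execute: (args: Record<string, unknown>) => Promise<unknown>;
};

// Step 4 (mocked): the workspace gateway proxies the call to the MCP server.
// A real gateway would forward the request over the network.
async function callGateway(
  tool: string,
  args: Record<string, unknown>
): Promise<unknown> {
  return { tool, args, result: "ok" };
}

// Steps 1-3: authenticate with the workspace token, discover the MCP tools,
// and convert each one into the framework's executable tool format.
async function fetchTools(workspaceToken: string): Promise<NativeTool[]> {
  if (!workspaceToken.startsWith("mwt_")) {
    throw new Error("expected an mwt_ workspace token");
  }
  // Mocked discovery -- the real SDK asks the workspace gateway instead.
  const discovered: McpTool[] = [
    { name: "search_docs", description: "Search workspace docs" },
  ];
  return discovered.map((t) => ({
    ...t,
    // When the LLM calls the tool, execution is proxied through the gateway.
    execute: (args) => callGateway(t.name, args),
  }));
}

// Usage:
fetchTools("mwt_example").then(async (tools) => {
  const result = await tools[0].execute({ query: "hello" });
  console.log(result); // -> the gateway's response for search_docs
});
```

The key point of the pattern is that the converted tool's `execute` never talks to the MCP server directly; every call goes through the gateway, which is what lets one token cover every connected server.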
Your Code → SDK → Workspace Gateway → MCP Server → Result → SDK → Your Code

## Which One Should I Use?
- **Vercel AI SDK** — Simplest integration. Tools auto-execute with `maxSteps`. Best for Next.js apps and streaming UIs.
- **LangChain** — Best for agent workflows with LangGraph. Works with any LLM provider (OpenAI, Anthropic, Google, Groq).
- **OpenAI SDK** — Fine-grained control over the tool call loop. Best when you need to inspect or modify tool calls before execution.
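The trade-off in that last option is easiest to see in code. Below is a schematic of the manual tool-call loop the OpenAI-style integration leaves in your hands; the function names and the mock model are invented for illustration and are not the real `@aerostack/sdk-openai` surface.

```typescript
// Schematic tool-call loop with a hook for inspecting calls before execution.
// All names here are illustrative, not the actual SDK API.
type ToolCall = { id: string; name: string; args: Record<string, unknown> };

// Mock "LLM": emits one tool call on the first turn, then stops.
function nextToolCalls(turn: number): ToolCall[] {
  return turn === 0
    ? [{ id: "call_1", name: "search_docs", args: { query: "auth" } }]
    : [];
}

// Mocked execution -- the real SDK would proxy via the workspace gateway.
async function executeTool(call: ToolCall): Promise<unknown> {
  return { echoed: call.args };
}

async function runLoop(): Promise<unknown[]> {
  const results: unknown[] = [];
  for (let turn = 0; ; turn++) {
    const calls = nextToolCalls(turn);
    if (calls.length === 0) break; // the model is done calling tools
    for (const call of calls) {
      // Fine-grained control: inspect, log, or veto the call right here.
      if (call.name !== "search_docs") continue; // example allow-list
      results.push(await executeTool(call));
    }
  }
  return results;
}
```

A framework like the Vercel AI SDK automates this loop for you; the manual version is what you want when tool calls must be allow-listed, audited, or rewritten before they run.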
## Prerequisites
All integrations require:
- An Aerostack account (sign up free)
- A workspace with at least one MCP server connected
- A workspace token (`mwt_...`) — generate one from your workspace settings
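If you keep the token in an environment variable, a small guard can fail fast on a missing or malformed value. The variable name `AEROSTACK_TOKEN` below is an assumption for illustration, not something the docs specify:

```typescript
// Hypothetical helper: the AEROSTACK_TOKEN variable name is an assumption.
// Pass process.env (or any string map) as `env`.
function loadWorkspaceToken(env: Record<string, string | undefined>): string {
  const token = env["AEROSTACK_TOKEN"];
  if (!token) throw new Error("AEROSTACK_TOKEN is not set");
  // Workspace tokens start with the mwt_ prefix.
  if (!token.startsWith("mwt_")) {
    throw new Error("workspace tokens start with mwt_");
  }
  return token;
}

// Usage:
const token = loadWorkspaceToken({ AEROSTACK_TOKEN: "mwt_example123" });
console.log(token); // prints "mwt_example123"
```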
## Don’t Need an SDK?
If you’re using an MCP-compatible client (Claude, Cursor, VS Code), you can connect directly via MCP config — no SDK needed. See Agent Automation.