Use Aerostack workspace tools as OpenAI function-calling tools.
Aerostack is the full-stack platform for AI agents — compose MCP servers, skills, and functions into a single workspace URL that any AI agent can call. This SDK lets you drop 250+ pre-built tools into your OpenAI app in 3 lines of code.
Without Aerostack, connecting your OpenAI app to external services means writing custom API integrations, managing auth, handling errors, and maintaining each one. With Aerostack, you compose a workspace of tools (GitHub, Slack, Notion, Stripe, 250+ more) and this SDK makes them available to OpenAI in one call.
Your App → OpenAI → @aerostack/sdk-openai → Aerostack Workspace → GitHub, Slack, Notion, ...
## Installation

```sh
npm install @aerostack/sdk-openai openai
```

## Quick Start

```ts
import OpenAI from 'openai';
import { getTools, handleToolCalls } from '@aerostack/sdk-openai';

const openai = new OpenAI();
const config = { workspace: 'my-workspace', token: 'mwt_...' };

// 1. Fetch workspace tools in OpenAI format
const { tools } = await getTools(config);

// 2. Use them in a chat completion
const response = await openai.chat.completions.create({
  model: 'gpt-4o',
  messages: [{ role: 'user', content: 'Create a GitHub issue for the login bug' }],
  tools,
});

// 3. Execute tool calls and get results
const message = response.choices[0]?.message;
if (message?.tool_calls) {
  const results = await handleToolCalls(message.tool_calls, config);
  // results are ChatCompletionToolMessageParam[]; append them to messages
  // and call the model again to continue the conversation
}
```

For reusable clients that cache tool name mappings:
```ts
import { createAerostackOpenAI } from '@aerostack/sdk-openai';

const aero = createAerostackOpenAI({ workspace: 'my-workspace', token: 'mwt_...' });
const { tools } = await aero.tools();

// ... after getting tool_calls from OpenAI ...
const results = await aero.handleToolCalls(toolCalls);
```

For a multi-step agent, feed tool results back into the conversation and call the model again until it stops requesting tools:

```ts
const messages = [{ role: 'user', content: 'Find open bugs in GitHub and post a summary to Slack' }];

while (true) {
  const response = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages,
    tools,
  });

  const msg = response.choices[0]?.message;
  if (!msg) break;
  messages.push(msg);

  if (!msg.tool_calls) break; // no more tool calls, we're done

  const results = await handleToolCalls(msg.tool_calls, config);
  messages.push(...results);
}
```

## API

### getTools

Fetches tools from the workspace and converts them to OpenAI's `ChatCompletionTool[]` format. Returns `{ tools, nameMap, raw }`.
### handleToolCall

Executes a single tool call. Errors are returned as tool messages (not thrown), because OpenAI expects a tool result even when a tool fails.
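The error-as-message behavior can be sketched like this (the shape follows OpenAI's tool message format; the exact error text the SDK produces is an assumption):

```typescript
// OpenAI requires a `tool` role message for every tool_call_id in the
// assistant message, so failures must become message content rather than
// thrown exceptions that would abort the conversation.
type ToolMessage = { role: 'tool'; tool_call_id: string; content: string };

// Hypothetical helper showing how an error can be folded into a tool message.
function toErrorToolMessage(toolCallId: string, err: unknown): ToolMessage {
  const detail = err instanceof Error ? err.message : String(err);
  return { role: 'tool', tool_call_id: toolCallId, content: `Error: ${detail}` };
}
```

The model sees the error text as the tool's output and can decide to retry, work around it, or report it to the user.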
### handleToolCalls

Executes multiple tool calls in parallel. Returns results in the same order as the input.
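Order preservation falls out of `Promise.all`, which resolves to results in input order even when the underlying calls complete out of order; a minimal sketch of the pattern (not the SDK's actual code):

```typescript
// Run an async action over every item concurrently. Promise.all resolves to an
// array in the same order as `items`, regardless of completion order.
async function runAll<T, R>(items: T[], run: (item: T) => Promise<R>): Promise<R[]> {
  return Promise.all(items.map((item) => run(item)));
}

// The slowest call (30ms) is first, yet the result array keeps input order.
const out = await runAll([30, 10, 20], async (ms) => {
  await new Promise((resolve) => setTimeout(resolve, ms));
  return ms;
});
```

This matters because each result must be matched back to its `tool_call_id` when appended to the conversation.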
### createAerostackOpenAI

Creates a reusable client with a shared `WorkspaceClient` and cached name mappings. Auto-fetches tools on the first `handleToolCall` if needed.
Lower-level utilities for custom integrations.

## How It Works

- Tool Discovery — `getTools()` calls your Aerostack workspace gateway via JSON-RPC to fetch all connected MCP server tools
- Format Conversion — MCP tool schemas (JSON Schema) are mapped to OpenAI's `ChatCompletionTool` format, with name sanitization for the 64-character limit
- Execution — when OpenAI returns tool calls, `handleToolCalls()` proxies them back through the workspace gateway to the actual MCP servers
- Error Handling — workspace errors are caught and returned as tool result messages, not thrown, keeping the conversation flow intact
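The name sanitization step can be sketched as follows (illustrative, not the SDK's actual implementation): OpenAI tool names must match `^[a-zA-Z0-9_-]+$` and be at most 64 characters, while MCP tool names often contain other characters.

```typescript
// Replace characters OpenAI rejects and truncate to the 64-character limit.
// An MCP name like "github/create_issue" becomes "github_create_issue".
function sanitizeToolName(mcpName: string): string {
  return mcpName.replace(/[^a-zA-Z0-9_-]/g, '_').slice(0, 64);
}
```

Sanitization is lossy, which is presumably why `getTools` also returns a `nameMap`: sanitized names need to be mapped back to the original MCP tool names at execution time.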
## Requirements

- Node.js 18+
- `openai` SDK >= 4.20.0
- An Aerostack workspace with a token (`mwt_...`)
## Getting Started

- Sign up at app.aerostack.dev
- Create a workspace and add MCP servers (GitHub, Slack, Notion, etc.)
- Copy the workspace token (`mwt_...`) from the workspace settings
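Rather than hard-coding the token, a common pattern is to load it from environment variables. A sketch, where `AEROSTACK_WORKSPACE` and `AEROSTACK_TOKEN` are illustrative names (the SDK does not read them automatically):

```typescript
// Hypothetical helper: build a config from env vars, failing fast if the
// token is missing or does not look like an Aerostack workspace token.
function loadConfig(env: Record<string, string | undefined>) {
  const workspace = env.AEROSTACK_WORKSPACE;
  const token = env.AEROSTACK_TOKEN;
  if (!workspace || !token?.startsWith('mwt_')) {
    throw new Error('Set AEROSTACK_WORKSPACE and a valid mwt_... token');
  }
  return { workspace, token };
}

// Usage: const config = loadConfig(process.env);
```

This keeps the token out of source control and lets the same code run against different workspaces per environment.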
## Related Packages

| Package | Framework |
|---|---|
| `@aerostack/sdk-vercel-ai` | Vercel AI SDK |
| `@aerostack/sdk-langchain` | LangChain.js |
| `@aerostack/core` | Core types + `WorkspaceClient` |
## License

MIT