
@aerostack/sdk-openai

Use Aerostack workspace tools as OpenAI function-calling tools.

Aerostack is the full-stack platform for AI agents — compose MCP servers, skills, and functions into a single workspace URL that any AI agent can call. This SDK lets you drop 250+ pre-built tools into your OpenAI app in 3 lines of code.


Why?

Without Aerostack, connecting your OpenAI app to external services means writing a custom API integration for each one — and managing its auth, error handling, and maintenance yourself. With Aerostack, you compose a workspace of tools (GitHub, Slack, Notion, Stripe, 250+ more) and this SDK makes them available to OpenAI in one call.

Your App → OpenAI → @aerostack/sdk-openai → Aerostack Workspace → GitHub, Slack, Notion, ...

Install

```bash
npm install @aerostack/sdk-openai openai
```

Quick Start

```typescript
import OpenAI from 'openai';
import { getTools, handleToolCalls } from '@aerostack/sdk-openai';

const openai = new OpenAI();
const config = { workspace: 'my-workspace', token: 'mwt_...' };

// 1. Fetch workspace tools in OpenAI format
const { tools } = await getTools(config);

// 2. Use them in a chat completion
const response = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: 'Create a GitHub issue for the login bug' }],
    tools,
});

// 3. Execute tool calls and get results
const message = response.choices[0]?.message;
if (message?.tool_calls) {
    const results = await handleToolCalls(message.tool_calls, config);
    // results are ChatCompletionToolMessageParam[] — append to messages and continue
}
```

Factory Pattern

For reusable clients that cache tool name mappings:

```typescript
import { createAerostackOpenAI } from '@aerostack/sdk-openai';

const aero = createAerostackOpenAI({ workspace: 'my-workspace', token: 'mwt_...' });

const { tools } = await aero.tools();
// ... after getting tool_calls from OpenAI ...
const results = await aero.handleToolCalls(toolCalls);
```

Multi-Turn Conversation

```typescript
const messages: OpenAI.ChatCompletionMessageParam[] = [
    { role: 'user', content: 'Find open bugs in GitHub and post a summary to Slack' },
];

while (true) {
    const response = await openai.chat.completions.create({
        model: 'gpt-4o',
        messages,
        tools,
    });

    const msg = response.choices[0]?.message;
    if (!msg) break;
    messages.push(msg);

    if (!msg.tool_calls?.length) break; // No more tool calls — done

    const results = await handleToolCalls(msg.tool_calls, config);
    messages.push(...results);
}
```

API Reference

getTools(config) → Promise<ToolSet>

Fetches tools from the workspace and converts to OpenAI ChatCompletionTool[] format. Returns { tools, nameMap, raw }.

handleToolCall(toolCall, config, nameMap?) → Promise<ChatCompletionToolMessageParam>

Executes a single tool call. Errors are returned as tool messages (not thrown), because OpenAI expects a tool result even when tools fail.
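
The error-as-message behavior can be sketched as follows. This is illustrative only — the interface and helper below are hypothetical stand-ins, not the SDK's internals — but it shows why a failed tool still produces a well-formed tool message: OpenAI requires exactly one tool message per `tool_call_id`, even on failure.

```typescript
// Illustrative sketch (hypothetical names): how a tool error becomes a
// tool result message instead of a thrown exception.
interface ToolMessageSketch {
  role: 'tool';
  tool_call_id: string;
  content: string;
}

function errorToolMessage(toolCallId: string, err: unknown): ToolMessageSketch {
  const message = err instanceof Error ? err.message : String(err);
  return {
    role: 'tool',
    tool_call_id: toolCallId,
    // Serialize the error into the message content so the model can react to it
    content: JSON.stringify({ error: message }),
  };
}
```

Appending such a message keeps the conversation valid, so the model can retry or explain the failure on the next turn.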

handleToolCalls(toolCalls, config, nameMap?) → Promise<ChatCompletionToolMessageParam[]>

Executes multiple tool calls in parallel. Returns results in the same order as input.

createAerostackOpenAI(config) → AerostackOpenAIClient

Creates a reusable client with a shared WorkspaceClient and cached name mappings. Auto-fetches tools on first handleToolCall if needed.

convertTools(mcpTools) / sanitizeToolName(name) / formatToolResult(result)

Lower-level utilities for custom integrations.
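
As a rough illustration of what name sanitization has to do (this is a sketch under assumptions, not the SDK's actual sanitizeToolName): OpenAI function names must match `^[a-zA-Z0-9_-]+$` and stay within 64 characters, while MCP tool names often contain slashes or dots.

```typescript
// Illustrative sketch of name sanitization (hypothetical helper, not the
// SDK's sanitizeToolName): replace disallowed characters and enforce the
// 64-character limit that OpenAI imposes on function names.
function sanitizeToolNameSketch(name: string): string {
  return name.replace(/[^a-zA-Z0-9_-]/g, '_').slice(0, 64);
}
```

Because sanitization is lossy, the SDK returns a nameMap from getTools() so sanitized names can be resolved back to the original MCP tool names at execution time.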

How It Works

  1. Tool Discovery — getTools() calls your Aerostack workspace gateway via JSON-RPC to fetch all connected MCP server tools
  2. Format Conversion — MCP tool schemas (JSON Schema) are mapped to OpenAI's ChatCompletionTool format with name sanitization for the 64-char limit
  3. Execution — When OpenAI returns tool calls, handleToolCalls() proxies them back through the workspace gateway to the actual MCP servers
  4. Error Handling — Workspace errors are caught and returned as tool result messages, not thrown, keeping the conversation flow intact
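
Step 2's format conversion can be sketched as below. The interfaces and function are illustrative stand-ins (the real convertTools handles more edge cases, including name sanitization), but the core mapping is direct because MCP's inputSchema is already JSON Schema:

```typescript
// Minimal sketch of MCP → OpenAI tool conversion (illustrative; the real
// convertTools in @aerostack/sdk-openai covers more cases).
interface McpToolSketch {
  name: string;
  description?: string;
  inputSchema: Record<string, unknown>;
}

interface ChatCompletionToolSketch {
  type: 'function';
  function: { name: string; description?: string; parameters: Record<string, unknown> };
}

function convertToolSketch(tool: McpToolSketch): ChatCompletionToolSketch {
  return {
    type: 'function',
    function: {
      name: tool.name, // the real conversion also sanitizes to OpenAI's name rules
      description: tool.description,
      parameters: tool.inputSchema, // MCP inputSchema is already JSON Schema
    },
  };
}
```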

Requirements

  • Node.js 18+
  • openai SDK >= 4.20.0
  • An Aerostack workspace with a token (mwt_...)

Getting Your Workspace Token

  1. Sign up at app.aerostack.dev
  2. Create a workspace and add MCP servers (GitHub, Slack, Notion, etc.)
  3. Copy the workspace token (mwt_...) from the workspace settings

Related Packages

| Package | Framework |
| --- | --- |
| @aerostack/sdk-vercel-ai | Vercel AI SDK |
| @aerostack/sdk-langchain | LangChain.js |
| @aerostack/core | Core types + WorkspaceClient |

License

MIT
