A Model Context Protocol (MCP) server built with the official MCP SDK, designed to be deployed on Koyeb and integrated with the OpenAI Apps SDK.
This MCP server demonstrates how to create AI-accessible tools and widgets that can be used by OpenAI's GPT models through the Apps SDK. The server runs as an HTTP service and exposes MCP capabilities via a streamable HTTP transport.
- UI Widget: An HTML interface that displays the todo list in ChatGPT (`ui://widget/todo.html`)
- `add_todo`: Creates a new todo item with the given title
- `complete_todo`: Marks a todo item as completed by its ID
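The two tools above can be thought of as simple operations on in-memory state. The sketch below models that logic as plain functions so it can run without the MCP SDK — `addTodo` and `completeTodo` are illustrative names, and the actual handlers in `server.js` may differ:

```javascript
// Hypothetical in-memory store behind the add_todo / complete_todo tools.
const todos = [];
let nextId = 1;

// add_todo: create a new item with the given title
function addTodo(title) {
  const todo = { id: nextId++, title, completed: false };
  todos.push(todo);
  return todo;
}

// complete_todo: mark an item completed by its ID
function completeTodo(id) {
  const todo = todos.find((t) => t.id === id);
  if (todo) todo.completed = true;
  return todo; // undefined when the ID is unknown
}

const t = addTodo("Buy milk");
completeTodo(t.id);
console.log(todos);
```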
- Node.js 20+
- Docker (for containerized deployment)
- A Koyeb account
- OpenAI Apps SDK access
- Clone the repository

  ```bash
  git clone <your-repo-url>
  cd example-mcp-server
  ```

- Install dependencies

  ```bash
  npm install
  ```

- Run the server

  ```bash
  node server.js
  ```

The server will start on `http://0.0.0.0:8787` by default (or port 8080 in production).
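Once it's running, you can smoke-test the endpoint with `curl`. This is a sketch — the port and the `2025-03-26` protocol version are assumptions — but the shape is standard: a streamable-HTTP MCP server speaks JSON-RPC over POST, and `initialize` is the first call a client makes.

```shell
# Hypothetical smoke test against a local run on port 8787.
BODY='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2025-03-26","capabilities":{},"clientInfo":{"name":"curl","version":"0.0.1"}}}'
curl -s -X POST http://localhost:8787/mcp \
  -H 'Content-Type: application/json' \
  -H 'Accept: application/json, text/event-stream' \
  -d "$BODY" || echo "request failed (is the server running?)"
```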
- Push your code to a GitHub repository
- Go to Koyeb Dashboard
- Click "Create Service"
- Select "GitHub" as the deployment method
- Choose your repository
- Koyeb will automatically detect the Dockerfile and deploy
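For reference, the Dockerfile Koyeb detects would look roughly like the sketch below — an assumption about this repository's setup, not its actual contents:

```dockerfile
# Hypothetical container setup; the repository's actual Dockerfile may differ.
FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci --omit=dev
COPY . .
ENV PORT=8080
EXPOSE 8080
CMD ["node", "server.js"]
```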
```bash
# Build the Docker image
docker build -t mcp-server .

# Run locally to test
docker run -p 8080:8080 -e PORT=8080 mcp-server
```

- `PORT`: The port the server listens on (default: 8787 locally, 8080 in production)
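In `server.js`, that variable would typically be resolved with a local-development fallback. A minimal sketch (`resolvePort` is an illustrative name, not necessarily what the code uses):

```javascript
// Resolve the listen port from the environment, defaulting to 8787
// for local development (production platforms usually inject PORT).
function resolvePort(env) {
  const p = Number(env.PORT);
  return Number.isInteger(p) && p > 0 ? p : 8787;
}

console.log(resolvePort(process.env));
```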
Once deployed on Koyeb, you'll receive a public URL. To use this MCP server with the OpenAI Apps SDK:
- Get your Koyeb deployment URL (e.g., `https://your-app.koyeb.app`)
- Important: The MCP endpoint is at `/mcp`, so your full URL will be: `https://your-app.koyeb.app/mcp`
- Configure in OpenAI Apps SDK:
  - Add the server URL to your ChatGPT settings
  - The server will appear as "todo-app" in the MCP servers list
- The OpenAI model will now be able to:
  - Add todos to your list
  - Mark todos as completed
  - Display an interactive widget showing all todos
```
.
├── server.js             # MCP server implementation with Node.js
├── public/
│   └── todo-widget.html  # Web component for ChatGPT UI
├── Dockerfile            # Container configuration for Node.js
├── package.json          # Node.js dependencies
└── README.md             # This file
```
- MCP SDK creates an MCP server with tools and a UI resource
- UI Widget (HTML file) gets served as a resource and rendered in ChatGPT's iframe
- `StreamableHTTPServerTransport` exposes the MCP protocol over HTTP at the `/mcp` endpoint
- Node.js HTTP Server serves the application
- OpenAI Apps SDK connects to the server, displays the UI, and makes tools available to GPT models
- When a tool is called, the result is passed to the widget via `window.openai.toolOutput`
To add a new tool, use `server.registerTool()`:

```js
server.registerTool(
  "tool_name",
  {
    title: "Tool Title",
    description: "Description of what your tool does",
    inputSchema: {
      param: z.string().min(1),
    },
    _meta: {
      "openai/outputTemplate": "ui://widget/your-widget.html",
      "openai/toolInvocation/invoking": "Running tool",
      "openai/toolInvocation/invoked": "Tool completed",
    },
  },
  async (args) => {
    // Your implementation here
    return {
      content: [{ type: "text", text: "Result message" }],
      structuredContent: { /* data for widget */ },
    };
  }
);
```

Edit `public/todo-widget.html` to customize how tool results are displayed in ChatGPT. The widget receives tool output via `window.openai.toolOutput` and can listen for updates using the `openai:set_globals` event.
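The widget's update path could look like the sketch below. Only `window.openai.toolOutput` and the `openai:set_globals` event come from this server's design; `renderTodos` and the `{ todos: [...] }` payload shape are assumptions, and the rendering is written as a pure function so it also runs outside the iframe:

```javascript
// Pure rendering step: turn a toolOutput payload into list-item markup.
// The { todos: [...] } shape is an assumed structuredContent layout.
function renderTodos(toolOutput) {
  const todos = (toolOutput && toolOutput.todos) || [];
  return todos
    .map((t) => `<li class="${t.completed ? "done" : ""}">${t.title}</li>`)
    .join("");
}

// Inside public/todo-widget.html the wiring would look roughly like:
//   const list = document.querySelector("#list");
//   list.innerHTML = renderTodos(window.openai.toolOutput);
//   window.addEventListener("openai:set_globals", () => {
//     list.innerHTML = renderTodos(window.openai.toolOutput);
//   });

console.log(renderTodos({ todos: [{ title: "Buy milk", completed: false }] }));
```

Keeping the render step pure makes both the initial paint and the `openai:set_globals` refresh go through the same code path.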
- Ensure the server binds to `0.0.0.0` (not `127.0.0.1`)
- Verify the `PORT` environment variable is set correctly
- Make sure you're connecting to the `/mcp` endpoint, not just the root URL
- Example: `https://your-app.koyeb.app/mcp` (not `https://your-app.koyeb.app/`)
- Check that Koyeb deployment is healthy
- Verify the service is listening on the correct port
- Check browser console for errors
- Verify that `structuredContent` is being returned from tool handlers
- Ensure the widget is reading from `window.openai.toolOutput`
MIT