This project demonstrates how to set up an MCP (Model Context Protocol) server that acts as a bridge between Claude (or any MCP-compatible client) and the Synthasaurus API, allowing you to query a custom AI agent (e.g., Co:Driver) and return its responses.
- The server is built using `FastMCP` from the `mcp.server.fastmcp` package.
- The `api_call` tool is exposed to the MCP client (e.g., Claude). When called, it:
  - Starts a new conversation with the Synthasaurus API.
  - Sends the user's message to the agent (e.g., Co:Driver) via the API.
  - Polls for the agent's response if the API responds asynchronously.
  - Returns the agent's response to the MCP client.
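The snippet below is a minimal sketch of that flow, not the actual implementation in `main.py`. The `FastMCP` setup and tool registration follow the `mcp` Python SDK; the HTTP client (`requests`), the Synthasaurus base URL, endpoint paths, and response fields are placeholders you would replace with the real API.

```python
import os
import time

import requests
from mcp.server.fastmcp import FastMCP

# Placeholder base URL and auth header; substitute the real Synthasaurus endpoint.
API_BASE = "https://api.synthasaurus.example/v1"
HEADERS = {"Authorization": f"Bearer {os.environ['AUTH_TOKEN']}"}

mcp = FastMCP("synthasaurus-bridge")


@mcp.tool()
def api_call(message: str) -> str:
    """Send a message to the configured agent (e.g., Co:Driver) and return its reply."""
    # 1. Start a new conversation (endpoint path is an assumption).
    conv = requests.post(f"{API_BASE}/conversations", headers=HEADERS, timeout=30).json()
    conv_id = conv["id"]

    # 2. Send the user's message to the agent.
    requests.post(
        f"{API_BASE}/conversations/{conv_id}/messages",
        headers=HEADERS,
        json={"content": message},
        timeout=30,
    )

    # 3. Poll until the agent's asynchronous reply is available.
    for _ in range(30):
        reply = requests.get(
            f"{API_BASE}/conversations/{conv_id}/reply", headers=HEADERS, timeout=30
        ).json()
        if reply.get("status") == "completed":
            # 4. Return the agent's text back to the MCP client.
            return reply["content"]
        time.sleep(2)
    return "Timed out waiting for the agent's response."


if __name__ == "__main__":
    mcp.run()
```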
Set the following environment variables before running the server:
- `AUTH_TOKEN`: Your API authorization token.
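As a rough illustration (the exact handling in `main.py` may differ), the server can read this token at startup and fail fast if it is missing, which is easier to debug than an authorization error on the first `api_call`:

```python
import os

# Read the Synthasaurus API token from the environment and
# refuse to start without it.
AUTH_TOKEN = os.environ.get("AUTH_TOKEN")
if not AUTH_TOKEN:
    raise RuntimeError("AUTH_TOKEN environment variable is not set")
```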
- `cd mcp-server-demo`
- Install dependencies (using uv): `uv add "mcp[cli]"`
- Run the standalone MCP development tools: `uv run mcp`
- Set the required environment variables (see above).
- To add the tool in the Claude app: `mcp install main.py`
- You can test it with the MCP Inspector: `mcp dev server.py`
- Connect to the server using an MCP-compatible client (e.g., Claude) and use the `api_call` tool to send a message to your agent.
Send a message to your agent via Claude or another MCP client:
- Tool: `api_call`
- Input: `"What is co:driver?"`
- The server will return the response from your agent via the Synthasaurus API.
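If you want to exercise the same call without Claude, a short script using the MCP Python client over stdio can invoke the tool directly. This is a minimal sketch: it assumes the server lives in `main.py` and that the tool takes a single `message` argument, as in the server sketch above; check `main.py` for the actual tool signature.

```python
import asyncio
import os

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


async def main() -> None:
    # Launch the server as a subprocess and talk to it over stdio,
    # forwarding AUTH_TOKEN so it can reach the Synthasaurus API.
    server = StdioServerParameters(
        command="python",
        args=["main.py"],
        env={"AUTH_TOKEN": os.environ["AUTH_TOKEN"]},
    )
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # "message" is the argument name assumed in the server sketch above.
            result = await session.call_tool("api_call", {"message": "What is co:driver?"})
            print(result.content)


if __name__ == "__main__":
    asyncio.run(main())
```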
See `mcp-server-demo/main.py` for implementation details.