This project recreates the demo objective:
- MCP server exposing two tools.
- Testing with `@modelcontextprotocol/inspector`.
- LangChain/LangGraph-compatible agent using MCP over Streamable HTTP.
- n8n AI Agent setup using MCP over Streamable HTTP.
The OpenAI key pasted in the prompt was exposed in chat. Revoke it and create a new key before running the agent. Put the new key in .env; do not commit it.
```powershell
Copy-Item .env.example .env
npm install
```

Edit `.env` and set:

```
OPENAI_API_KEY=your_new_key
```
```shell
npm run server
```

The MCP endpoint is `http://127.0.0.1:3000/mcp` and the health endpoint is `http://127.0.0.1:3000/health`.
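The `/mcp` endpoint speaks JSON-RPC 2.0 over Streamable HTTP. As a minimal sketch of the envelope a client POSTs there (the `protocolVersion` and `clientInfo` values are assumptions based on the MCP specification, not taken from this repo):

```typescript
// Sketch: the JSON-RPC 2.0 initialize request a Streamable HTTP MCP
// client would POST to http://127.0.0.1:3000/mcp. The protocolVersion
// and clientInfo values below are illustrative assumptions.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: Record<string, unknown>;
}

function buildInitialize(id: number): JsonRpcRequest {
  return {
    jsonrpc: "2.0",
    id,
    method: "initialize",
    params: {
      protocolVersion: "2025-03-26", // assumed spec revision
      capabilities: {},
      clientInfo: { name: "demo-client", version: "0.0.1" }, // hypothetical
    },
  };
}

console.log(JSON.stringify(buildInitialize(1)));
```

In practice the Inspector, the LangChain agent, and n8n all build this handshake for you; the sketch only shows what travels over the wire.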
In one terminal, keep the server running:

```shell
npm run server
```

In another terminal, list the tools with the Inspector CLI:

```shell
npm run inspector:list-tools
```

On Windows, the current Inspector CLI can print the correct JSON response and then exit with a Node/libuv assertion. If the tools JSON lists `calculate` and `text_stats`, the MCP call itself succeeded.
Call a tool with the Inspector CLI:

```shell
npx --yes @modelcontextprotocol/inspector --cli http://127.0.0.1:3000/mcp --transport http --method tools/call --tool-name calculate --tool-arg operation=add --tool-arg "numbers=[2,3,4]"
```

You can also open the Inspector UI:
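Each `--tool-arg key=value` pair becomes an entry in the `arguments` object of a `tools/call` request: values that parse as JSON (like `[2,3,4]`) are sent as structured data, anything else stays a string. A hedged sketch of that mapping (this mirrors the CLI's observable behavior, not its actual source):

```typescript
// Sketch: turn Inspector-style "key=value" pairs into a tools/call
// JSON-RPC payload. JSON-looking values are parsed; others stay strings.
function parseToolArg(pair: string): [string, unknown] {
  const eq = pair.indexOf("=");
  const key = pair.slice(0, eq);
  const raw = pair.slice(eq + 1);
  try {
    return [key, JSON.parse(raw)]; // "[2,3,4]" -> [2, 3, 4]
  } catch {
    return [key, raw]; // "add" stays the string "add"
  }
}

function buildToolCall(name: string, pairs: string[]) {
  const args: Record<string, unknown> = {};
  for (const p of pairs) {
    const [k, v] = parseToolArg(p);
    args[k] = v;
  }
  return {
    jsonrpc: "2.0" as const,
    id: 2,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const call = buildToolCall("calculate", ["operation=add", "numbers=[2,3,4]"]);
console.log(JSON.stringify(call.params));
```

This is why `numbers=[2,3,4]` is quoted on the command line: the brackets must reach the CLI intact so they can be parsed as a JSON array.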
```shell
npm run inspector
```

Then select:

- Transport: Streamable HTTP
- URL: http://127.0.0.1:3000/mcp
```shell
npm run smoke
```

This lists the MCP tools and calls `calculate`.
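The `calculate` call with `operation=add` and `numbers=[2,3,4]` should reduce the list with the chosen operation. A hypothetical local model of that logic (the server's actual implementation, supported operations, and error handling may differ):

```typescript
// Hypothetical local model of the calculate tool: reduce a list of
// numbers with the requested operation. Assumed, not the server's code.
type Operation = "add" | "subtract" | "multiply" | "divide";

function calculate(operation: Operation, numbers: number[]): number {
  if (numbers.length === 0) throw new Error("numbers must be non-empty");
  return numbers.reduce((acc, n) => {
    switch (operation) {
      case "add":
        return acc + n;
      case "subtract":
        return acc - n;
      case "multiply":
        return acc * n;
      case "divide":
        if (n === 0) throw new Error("division by zero");
        return acc / n;
    }
  });
}

console.log(calculate("add", [2, 3, 4])); // -> 9
```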
Make sure the server is running and `.env` contains a valid, rotated `OPENAI_API_KEY`.

```shell
npm run agent
```

To run with a custom prompt:

```shell
npm run agent -- "Calculate 42 / 6 then analyze the text: Hello from MCP."
```

Follow docs/n8n-agent.md.
The n8n MCP Client Tool configuration is:
- Endpoint: http://127.0.0.1:3000/mcp
- Server Transport: HTTP Streamable
- Authentication: None
- Tools to Include: All
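With Tools to Include set to All, the n8n agent sees both `calculate` and `text_stats`. As a hypothetical local model of what `text_stats` returns for an input like "Hello from MCP." (the real tool's output fields and counting rules are assumptions):

```typescript
// Hypothetical local model of the text_stats tool: character, word,
// and sentence counts. The server's actual output shape may differ.
function textStats(text: string) {
  const words = text.trim().split(/\s+/).filter((w) => w.length > 0);
  const sentences = text.split(/[.!?]+/).filter((s) => s.trim().length > 0);
  return {
    characters: text.length,
    words: words.length,
    sentences: sentences.length,
  };
}

console.log(textStats("Hello from MCP."));
```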