A production-ready Model Context Protocol (MCP) server implemented in TypeScript. The server provides:
- **OpenAI connectivity demo** – prove the API key works end-to-end via `npm run demo:openai`.
- **MCP tool demo** – spawn the server and call tools through an MCP client using `npm run demo:tool`.
- **Extensibility demo** – hot-load third-party tools from disk via `npm run demo:ext` or `MCP_TOOL_MODULES`.
- **Browser UI demo** – launch an interactive web page that exercises the OpenAI call and the knowledge-search tool with `npm run demo:ui`.
The codebase focuses on clean abstractions, schema validation, and commercial readiness (logging, config safety, tests).
- Node.js 18+ (Node 20 recommended to avoid optional engine warnings).
- npm 9+.
- A valid `OPENAI_API_KEY` with access to the desired models.
```bash
npm install
cp .env.example .env   # fill in OPENAI_API_KEY
npm run build
npm start              # runs the compiled MCP server on stdio
```

To run the TypeScript entry directly during development:

```bash
npm run dev
```

| Variable | Description |
|---|---|
| `OPENAI_API_KEY` | **Required.** API key for OpenAI. |
| `OPENAI_BASE_URL` | Override base URL for Azure/OpenAI proxies. |
| `OPENAI_TIMEOUT_MS` | Timeout (ms) applied to OpenAI API calls. Defaults to `20000`. |
| `MCP_SERVER_NAME` | Name advertised to MCP clients. |
| `LOG_LEVEL` | `fatal` → `trace`. Defaults to `info`. |
| `MCP_TOOL_MODULES` | Comma-separated absolute paths to extra tool modules (see extensibility demo). |
| `MCP_PORT` | Reserved for future transports; defaults to `7337`. |
| `UI_DEMO_PORT` | Optional port for the browser UI demo. Defaults to `4399`. |
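The table above implies a small amount of validation and defaulting logic. As a minimal, hypothetical sketch in the spirit of `src/config/env.ts` (the real module may differ in names and structure, and the `"mcp-server"` fallback name is an assumption):

```typescript
// Illustrative config loader applying the documented defaults.
// The actual implementation lives in src/config/env.ts.
type LogLevel = "fatal" | "error" | "warn" | "info" | "debug" | "trace";

interface ServerConfig {
  openaiApiKey: string;
  openaiBaseUrl?: string;
  openaiTimeoutMs: number;
  mcpServerName: string;
  logLevel: LogLevel;
  toolModules: string[];
  mcpPort: number;
  uiDemoPort: number;
}

function loadConfig(env: Record<string, string | undefined>): ServerConfig {
  const apiKey = env.OPENAI_API_KEY;
  if (!apiKey) {
    throw new Error("OPENAI_API_KEY is required");
  }
  const logLevels: LogLevel[] = ["fatal", "error", "warn", "info", "debug", "trace"];
  const logLevel = (env.LOG_LEVEL ?? "info") as LogLevel;
  if (!logLevels.includes(logLevel)) {
    throw new Error(`Invalid LOG_LEVEL: ${logLevel}`);
  }
  return {
    openaiApiKey: apiKey,
    openaiBaseUrl: env.OPENAI_BASE_URL,
    openaiTimeoutMs: Number(env.OPENAI_TIMEOUT_MS ?? 20000),
    mcpServerName: env.MCP_SERVER_NAME ?? "mcp-server", // fallback name is an assumption
    logLevel,
    // Comma-separated paths, trimmed, empty entries dropped.
    toolModules: (env.MCP_TOOL_MODULES ?? "")
      .split(",")
      .map((p) => p.trim())
      .filter(Boolean),
    mcpPort: Number(env.MCP_PORT ?? 7337),
    uiDemoPort: Number(env.UI_DEMO_PORT ?? 4399),
  };
}
```

Failing fast on a missing `OPENAI_API_KEY` keeps misconfiguration errors at startup rather than at the first API call.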
Verifies credentials and model access:
```bash
npm run demo:openai
```

Outputs the model reply plus token usage metrics via Pino logs.
Spawns the compiled MCP server (`node dist/index.js`) and connects with the official MCP client SDK:

```bash
npm run build
npm run demo:tool
```

Set `MCP_DEMO_SERVER_COMMAND` / `MCP_DEMO_SERVER_ARGS` if you want the client to launch a different command (for example `npx tsx src/index.ts`). The script lists tools and invokes `knowledge_search` end-to-end.
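The override behaviour can be pictured as a small resolver. The function below is illustrative rather than the demo's actual internals, and it assumes a plain space-split of `MCP_DEMO_SERVER_ARGS`:

```typescript
// Sketch: decide which command the demo client spawns for the MCP server.
interface ServerLaunch {
  command: string;
  args: string[];
}

function resolveServerLaunch(env: Record<string, string | undefined>): ServerLaunch {
  // Default: spawn the compiled server over stdio (node dist/index.js).
  const command = env.MCP_DEMO_SERVER_COMMAND ?? "node";
  const args = env.MCP_DEMO_SERVER_ARGS
    ? env.MCP_DEMO_SERVER_ARGS.split(" ").filter(Boolean)
    : ["dist/index.js"];
  return { command, args };
}
```

With `MCP_DEMO_SERVER_COMMAND=npx` and `MCP_DEMO_SERVER_ARGS="tsx src/index.ts"`, the client would launch the TypeScript entry directly instead of the compiled build.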
Ships with `src/examples/plugins/stockQuoteTool.ts`. After `npm run build` the compiled module lives at `dist/examples/plugins/stockQuoteTool.js`.

Load it either through the demo script:

```bash
npm run build
npm run demo:ext
```

or by setting an environment variable before starting the server:

```bash
export MCP_TOOL_MODULES=$(pwd)/dist/examples/plugins/stockQuoteTool.js
npm start
```

The server automatically registers every tool exported from the referenced module(s).
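As a sketch of what such a module might export, here is a hypothetical tool definition loosely modeled on `stockQuoteTool`. The actual plugin contract is defined by `src/mcp/registry.ts` and likely differs in detail; the `ToolDefinition` shape, tool name, and stubbed data below are assumptions:

```typescript
// Hypothetical plugin-module shape: the registry would pick up this export.
interface ToolDefinition {
  name: string;
  description: string;
  // JSON Schema describing the tool's input, as MCP tools advertise.
  inputSchema: Record<string, unknown>;
  handler: (args: Record<string, unknown>) => Promise<unknown>;
}

export const stockQuoteTool: ToolDefinition = {
  name: "stock_quote",
  description: "Return a quote for a ticker symbol (stubbed data).",
  inputSchema: {
    type: "object",
    properties: { symbol: { type: "string" } },
    required: ["symbol"],
  },
  async handler(args) {
    const symbol = String(args.symbol).toUpperCase();
    // A real plugin would call a market-data API here; this returns a stub.
    return { symbol, price: 123.45, currency: "USD" };
  },
};
```

Because the server registers every exported tool, a single module file can bundle several related tools.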
Launch a lightweight HTTP server that serves `public/ui-demo.html`:

```bash
npm run demo:ui
```

Visit `http://localhost:4399` (or `UI_DEMO_PORT`) to:

- Send prompts directly to OpenAI using the configured API key.
- Call the built-in `knowledge_search` tool through a REST façade.

Responses render inline so you can validate both flows without leaving the browser.
- TypeScript strict mode with `tsc` for builds.
- Vitest for unit testing (`npm test`).
- ESLint + Prettier for linting/formatting (`npm run lint`, `npm run format`).
- Pino structured logging with pretty printing in development.

```bash
npm run lint
npm test
```

Coverage reports are emitted under `coverage/` via V8 instrumentation.
- `src/config/env.ts` – centralized, validated environment loading.
- `src/clients/openaiClient.ts` – resilient OpenAI wrapper implementing the `LLMProvider` contract.
- `src/mcp/registry.ts` – tool lifecycle management + dynamic module loading.
- `src/mcp/server.ts` – MCP server wiring, tool adapters, and plugin APIs.
- `src/demos/*` – runnable scripts covering the three required scenarios.
- `src/examples/plugins/*` – sample plugin(s) for extensibility demos.
- `tests/*` – Vitest coverage for critical units.
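The dynamic module loading that `src/mcp/registry.ts` performs for `MCP_TOOL_MODULES` can be sketched roughly as follows; identifier names here are illustrative, not the registry's real API:

```typescript
// Sketch: load plugin modules and collect every export that looks like a tool.
type ToolLike = { name: string; handler: (args: unknown) => Promise<unknown> };

function isTool(value: unknown): value is ToolLike {
  return (
    typeof value === "object" &&
    value !== null &&
    typeof (value as ToolLike).name === "string" &&
    typeof (value as ToolLike).handler === "function"
  );
}

async function loadToolModules(paths: string[]): Promise<ToolLike[]> {
  const tools: ToolLike[] = [];
  for (const path of paths) {
    // Dynamic import() handles both ESM and transpiled CJS plugin modules.
    const mod: Record<string, unknown> = await import(path);
    // Register every export that passes the tool shape check; ignore the rest.
    for (const exported of Object.values(mod)) {
      if (isTool(exported)) tools.push(exported);
    }
  }
  return tools;
}
```

Validating each export with a structural check keeps a malformed plugin from crashing the server at registration time.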
For a deeper architectural overview, read `docs/architecture.md`.