feat: add HTTP/SSE MCP servers and usage telemetry #1202
feat: HTTP/SSE MCP servers + PostHog usage telemetry
Adds support for MCP servers over Streamable HTTP and legacy SSE, in addition to the existing stdio transport.
Also adds PostHog telemetry to measure MCP usage per message (server counts, tool counts, and transport mix; see the telemetry section of the plan below).
Validation:
- make static-check
Implementation Plan
Add HTTP/SSE MCP server support
Goal
Enable Mux to connect to MCP servers over Streamable HTTP and legacy SSE, in addition to the existing stdio/NDJSON transport.
This should work end-to-end through Settings → Projects (persisted to .mux/mcp.jsonc), including Test and tool allowlisting.
Recommended approach (net ~650 LoC, product code)
Use the existing @ai-sdk/mcp client wrapper (experimental_createMCPClient) and extend MCPServerManager so it can start either stdio servers or URL-based (HTTP/SSE) servers.
Why this approach
It reuses the existing client wrapper and the current config/allowlist plumbing, keeps the change to roughly ~650 LoC of product code, and avoids adding a new dependency (see the alternative considered at the end of this plan).
Config format
Continue supporting the current formats:
- "name": "<command>" → stdio
Extend object entries to support HTTP transports:
{ "servers": { // stdio (existing) "memory": "npx -y @modelcontextprotocol/server-memory", // streamable HTTP (current spec) "remote": { "transport": "http", // or omit if url is present (defaults to auto) "url": "http://localhost:3333/mcp", "headers": { "Authorization": { "secret": "MCP_TOKEN" }, "X-Client": "mux" } }, // legacy SSE transport "legacy": { "transport": "sse", "url": "http://localhost:3333/sse" }, // auto mode (try http then fallback to legacy sse) "auto": { "transport": "auto", "url": "http://localhost:3333/mcp" } } }Notes:
- headers values are either plain strings or { "secret": "<project-secret-key>" } (resolved via Settings → Projects → Secrets).
Backend changes
1) Types + schemas
Update src/common/types/mcp.ts:
- Add transport: 'stdio' | 'http' | 'sse' | 'auto'.
- Replace the command-only server shape with a discriminated union (a sketch follows this list):
  - { transport: 'stdio'; command: string; ... }
  - { transport: 'http' | 'sse' | 'auto'; url: string; headers?: Record<string, string | { secret: string }>; ... }
- Keep disabled + toolAllowlist as common fields.
Update the Zod schemas in src/common/orpc/schemas/mcp.ts to match:
- Change the projects.mcp.add input from { name, command } to something like { name, transport, command?, url?, headers? }.
- Extend the projects.mcp.test input to support testing URL-based servers (plus transport).
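A minimal sketch of what that union could look like (the interface names here are illustrative, not the existing mux type names):

```ts
// Hypothetical shape of the updated MCP server config types.
type MCPHeaderValue = string | { secret: string };

interface MCPServerCommon {
  disabled?: boolean;
  toolAllowlist?: string[];
}

interface MCPStdioServerInfo extends MCPServerCommon {
  transport: 'stdio';
  command: string;
}

interface MCPRemoteServerInfo extends MCPServerCommon {
  transport: 'http' | 'sse' | 'auto';
  url: string;
  headers?: Record<string, MCPHeaderValue>;
}

type MCPServerInfo = MCPStdioServerInfo | MCPRemoteServerInfo;
```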
2) Config parsing + persistence
Update src/node/services/mcpConfigService.ts:
- Extend normalizeEntry to accept (a sketch follows this list):
  - command → stdio
  - url → http/auto (default transport: 'auto' unless explicitly set)
  - transport: 'sse' → legacy SSE
- Update saveConfig to write the new transport/url/headers fields back out.
- Add defensive validation:
  - stdio entries have command, HTTP entries have url.
  - header values are strings or { secret: string }.
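A rough sketch of how normalizeEntry could branch, assuming the MCPServerInfo union sketched above (this is illustrative logic, not the existing service code; disabled/toolAllowlist passthrough is omitted):

```ts
// Hypothetical normalization: string entries stay stdio, url entries become http/sse/auto.
function normalizeEntry(name: string, raw: unknown): MCPServerInfo {
  if (typeof raw === 'string') {
    // "name": "<command>" → stdio (existing format)
    return { transport: 'stdio', command: raw };
  }
  const entry = raw as Record<string, unknown>;
  if (typeof entry.command === 'string') {
    return { transport: 'stdio', command: entry.command };
  }
  if (typeof entry.url === 'string') {
    // Default to 'auto' unless the entry sets transport explicitly ('http' | 'sse' | 'auto').
    const transport = (entry.transport as 'http' | 'sse' | 'auto' | undefined) ?? 'auto';
    return {
      transport,
      url: entry.url,
      headers: entry.headers as Record<string, string | { secret: string }> | undefined,
    };
  }
  throw new Error(`MCP server "${name}": entry needs either a command or a url`);
}
```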
3) Starting servers (stdio vs http/sse)
Update src/node/services/mcpServerManager.ts:
- Change the "enabled servers" map from Record<string, string> to Record<string, MCPServerInfo> (filtered to enabled).
- Add a helper resolveHeaders(serverInfo, projectSecrets): Record<string, string> that passes plain string values through and resolves { secret: key } by looking the key up in the project secrets record.
- Add a helper to create clients, createClientForServer(serverInfo, resolvedHeaders): Promise<MCPClient>:
  - stdio → the existing MCPStdioTransport path
  - http → experimental_createMCPClient({ transport: { type: 'http', url, headers } })
  - sse → experimental_createMCPClient({ transport: { type: 'sse', url, headers } })
- Keep existing behavior: result handling (transformMCPResult) and allowlist filtering.
A sketch of the two helpers follows.
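A minimal sketch of the two helpers, assuming the MCPServerInfo types above; the exact experimental_createMCPClient transport options are taken from this plan and may need adjusting against the real @ai-sdk/mcp API, and startStdioClient stands in for the existing stdio path:

```ts
import { experimental_createMCPClient } from 'ai'; // or the @ai-sdk/mcp wrapper this plan refers to

// Stand-in for the existing stdio startup path.
declare function startStdioClient(command: string): Promise<unknown>;

// Resolve static and secret-backed header values into plain strings.
function resolveHeaders(
  server: MCPRemoteServerInfo,
  projectSecrets: Record<string, string>,
): Record<string, string> {
  const resolved: Record<string, string> = {};
  for (const [name, value] of Object.entries(server.headers ?? {})) {
    resolved[name] = typeof value === 'string' ? value : (projectSecrets[value.secret] ?? '');
  }
  return resolved;
}

// Create a client for one configured server; the 'auto' → SSE fallback is omitted in this sketch.
async function createClientForServer(
  server: MCPServerInfo,
  resolvedHeaders: Record<string, string>,
) {
  if (server.transport === 'stdio') {
    return startStdioClient(server.command);
  }
  const type = server.transport === 'sse' ? 'sse' : 'http';
  return experimental_createMCPClient({
    transport: { type, url: server.url, headers: resolvedHeaders },
  });
}
```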
4) Plumbing secrets into MCP startup
- Pass the project secrets into MCPServerManager.getToolsForWorkspace (from AIService, which already loads secrets).
- Inject them rather than having MCPServerManager read secrets itself, to keep it testable.
5) Update system prompt wiring
- buildSystemMessage can keep taking a map of server names; it only lists keys.
- Widen the parameter type to Record<string, unknown> or a new EnabledMCPServerMap type.
Telemetry (PostHog)
Goals
Capture privacy-safe usage metrics that answer: how often MCP is used per message, how many servers are enabled/started/failed, how many MCP vs. built-in tools are exposed, and which transports are in use.
Privacy constraints
Follow the src/common/telemetry/payload.ts guidelines: no raw server names, URLs, headers, commands, or error strings; only counts, booleans, enums, and bucketed durations.
New events + properties
Add new backend-emitted telemetry events (captured via TelemetryService.capture, not the renderer trackEvent):
mcp_context_injected
Emitted once per AIService.streamMessage, right before startStream, even if no MCP servers are configured (counts may be 0).
Properties (all safe):
- workspaceId (stable random ID)
- model
- mode
- runtimeType (local|worktree|ssh)
- mcp_server_enabled_count
- mcp_server_started_count
- mcp_server_failed_count
- mcp_tool_count
- total_tool_count
- builtin_tool_count
- mcp_transport_mode (none|stdio_only|http_only|sse_only|mixed)
- mcp_has_http, mcp_has_sse, mcp_has_stdio (booleans)
- mcp_auto_fallback_count (how many auto servers required fallback; 0 if not tracked)
- mcp_setup_duration_ms_b2 (time spent creating MCP clients + fetching tools)
mcp_server_tested
Emit from the projects.mcp.test handler after the test completes:
- transport (stdio|http|sse|auto)
- success (boolean)
- duration_ms_b2
- error_category (timeout|connect|http_status|unknown), without raw error strings
mcp_server_config_changed
Emit from project MCP mutation handlers (add/remove/setEnabled/setToolAllowlist):
- action (add|edit|remove|enable|disable|set_tool_allowlist|set_headers)
- transport
- has_headers (boolean)
- uses_secret_headers (boolean)
- tool_allowlist_size_b2 (only for allowlist updates)
Backend implementation details
- Extend the TelemetryEventPayload union in src/common/telemetry/payload.ts with the new event types.
- Mirror them in src/common/orpc/schemas/telemetry.ts even if the events are backend-only, so schema + documentation stay in sync.
- Add AIService.setTelemetryService(telemetryService: TelemetryService) (similar to setMCPServerManager) and wire it up in ServiceContainer after constructing services.
- In AIService.streamMessage, measure MCP setup timing and emit mcp_context_injected after mcpTools + the final tools are known (a property-building sketch follows).
- To report mcp_server_started_count reliably, return stats from MCPServerManager.getToolsForWorkspace (e.g. { tools, stats }) or add a separate getWorkspaceStats API.
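A sketch of a pure helper AIService could use to build the mcp_context_injected properties from those stats (the stats field names and the power-of-two interpretation of the _b2 suffix are assumptions):

```ts
// Hypothetical per-stream MCP stats returned by getToolsForWorkspace.
interface MCPWorkspaceStats {
  enabledCount: number;
  startedCount: number;
  failedCount: number;
  transportMode: 'none' | 'stdio_only' | 'http_only' | 'sse_only' | 'mixed';
  hasHttp: boolean;
  hasSse: boolean;
  hasStdio: boolean;
  autoFallbackCount: number;
}

function mcpContextInjectedProperties(
  stats: MCPWorkspaceStats,
  mcpToolCount: number,
  builtinToolCount: number,
  setupDurationMs: number,
) {
  return {
    mcp_server_enabled_count: stats.enabledCount,
    mcp_server_started_count: stats.startedCount,
    mcp_server_failed_count: stats.failedCount,
    mcp_tool_count: mcpToolCount,
    builtin_tool_count: builtinToolCount,
    total_tool_count: mcpToolCount + builtinToolCount,
    mcp_transport_mode: stats.transportMode,
    mcp_has_http: stats.hasHttp,
    mcp_has_sse: stats.hasSse,
    mcp_has_stdio: stats.hasStdio,
    mcp_auto_fallback_count: stats.autoFallbackCount,
    // Bucket the duration so exact timings are never reported (assumed meaning of "_b2").
    mcp_setup_duration_ms_b2: 2 ** Math.ceil(Math.log2(Math.max(1, setupDurationMs))),
  };
}
```

AIService.streamMessage would merge these with workspaceId, model, mode, and runtimeType before calling TelemetryService.capture.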
Optional (nice-to-have): tool call telemetry
If you also want "how often are MCP tools actually invoked?", add an mcp_tool_calls_summary event on stream-end, with per-stream MCP tool call counts gathered in AIService (it already re-emits StreamManager tool events).
This is a follow-up if we want to keep the first PR smaller.
PostHog dashboard
Create a dashboard (e.g., "MCP Usage") and add the following insights:
- Messages with MCP context: count of mcp_context_injected.
- Server counts per message, from mcp_context_injected: enabled (mcp_server_enabled_count), started (mcp_server_started_count), failed (mcp_server_failed_count).
- Tool counts per message, from mcp_context_injected: MCP tools (mcp_tool_count), total tools (total_tool_count), built-in tools (builtin_tool_count).
- mcp_context_injected with breakdown by mcp_transport_mode.
- Auto fallback: mcp_context_injected on mcp_auto_fallback_count, plus a filter on mcp_auto_fallback_count > 0 and count.
- mcp_server_tested broken down by success and/or transport.
- mcp_server_config_changed broken down by action and transport.
Programmatic dashboard creation (optional)
If we want this automated (rather than manual UI steps), we can do it in Exec mode using the PostHog MCP tools.
This is best done after the events ship so PostHog has the event/property definitions.
Frontend (Settings UI) changes
1) List + edit servers
Update src/browser/components/Settings/sections/ProjectSettingsSection.tsx:
- Show each server's command (stdio) or url (HTTP/SSE) in the server list.
2) Add server form
- Add a transport selector: Stdio (default), HTTP (Streamable), SSE (Legacy), Auto (HTTP → SSE).
3) Headers UI (static + secret)
- Support plain string values and { secret: <key> } values (typed input; optionally a dropdown populated via projects.secrets.get). A form-state sketch follows.
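A possible shape for the add-server form state, mirroring the { name, transport, command?, url?, headers? } input sketched earlier (names here are illustrative, not the existing component state):

```ts
// Hypothetical form state for the "Add server" form.
interface HeaderRow {
  name: string;
  kind: 'static' | 'secret';
  value: string; // literal header value, or the project secret key when kind === 'secret'
}

interface AddMCPServerFormState {
  name: string;
  transport: 'stdio' | 'http' | 'sse' | 'auto';
  command: string; // used when transport === 'stdio'
  url: string; // used for http/sse/auto
  headers: HeaderRow[];
}

// Convert form rows into the config header shape before calling projects.mcp.add.
function toHeaderConfig(rows: HeaderRow[]): Record<string, string | { secret: string }> {
  return Object.fromEntries(
    rows.map((row) => [row.name, row.kind === 'secret' ? { secret: row.value } : row.value]),
  );
}
```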
Testing
Unit tests
Update and extend src/node/services/mcpConfigService.test.ts:
- parsing and normalization of url + transport entries
Add unit tests for MCP telemetry helpers (pure functions):
- deriving mcp_transport_mode
- building mcp_server_tested properties
Add a focused unit test file for header resolution + auto fallback decision logic (pure functions). A transport-mode test sketch follows.
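A sketch of such a pure helper and its test; the test-runner import is a placeholder for whatever the repo actually uses:

```ts
import { describe, expect, it } from 'vitest'; // adjust to the project's test runner

type TransportMode = 'none' | 'stdio_only' | 'http_only' | 'sse_only' | 'mixed';

// Derive mcp_transport_mode from the transports of the servers that actually started.
function deriveTransportMode(transports: Array<'stdio' | 'http' | 'sse'>): TransportMode {
  const unique = new Set(transports);
  if (unique.size === 0) return 'none';
  if (unique.size > 1) return 'mixed';
  const only = [...unique][0];
  return only === 'stdio' ? 'stdio_only' : only === 'http' ? 'http_only' : 'sse_only';
}

describe('deriveTransportMode', () => {
  it('returns none when no servers are enabled', () => {
    expect(deriveTransportMode([])).toBe('none');
  });

  it('returns a *_only mode for a single transport', () => {
    expect(deriveTransportMode(['http', 'http'])).toBe('http_only');
  });

  it('returns mixed when transports differ', () => {
    expect(deriveTransportMode(['stdio', 'sse'])).toBe('mixed');
  });
});
```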
Integration test (recommended)
Add a test under tests/integration using an official MCP server fixture (or a tiny in-test server) that exposes 1–2 tools over Streamable HTTP and SSE.
This verifies end-to-end compatibility without mocks.
Rollout / compatibility notes
- Existing .mux/mcp.jsonc entries continue to work unchanged.
Alternative considered: adopt @modelcontextprotocol/sdk transports directly
This can improve spec compliance (session IDs, retries, etc.) by using StreamableHTTPClientTransport / SSEClientTransport directly.
It's a reasonable follow-up if we hit limitations with @ai-sdk/mcp's built-in HTTP/SSE config, but it likely adds ~150–250 LoC + a new dependency.
Generated with mux • Model: openrouter:openai/gpt-5.2 • Thinking: high