Local MCP servers broken? #5284

@croqaz

Description

What version of Codex is running?

codex-cli 0.46.0

What subscription do you have?

enterprise

Which model were you using?

gpt-5-codex

What platform is your computer?

Linux 6.17.3-2-cachyos x86_64 unknown

What issue are you seeing?

Hi. I can't get local MCP servers to work at all with Codex.
I tried the reference MCP servers in both Python and TypeScript.

What steps can reproduce the bug?

These are the MCP servers that I used: https://gist.github.com/croqaz/dfccc9c1cf5c9841428fd9f0c44f6014
Again, they are just the standard Python and TypeScript reference servers.
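
For reference, the Python one is essentially the standard quickstart server from the MCP Python SDK -- roughly the sketch below (the tool and names are illustrative; the gist has the exact code):

# server.py -- minimal stdio MCP server built on the official Python SDK (sketch)
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("Demo")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two numbers."""
    return a + b

if __name__ == "__main__":
    # run() defaults to the stdio transport, which is what Codex spawns
    mcp.run()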

What is the expected behavior?

Codex should start the configured local MCP servers and make their tools available.

Additional information

This is my config (in ~/.codex/config.toml):

[model_providers.openai]
name = "OpenAI"
base_url = "https://api.openai.com/v1"
env_key = "OPENAI_API_KEY"

[mcp_servers.mcp1]
command = "uv"
args = ["run", "mcp", "run", "server.py"]
cwd = "/mcp1"

[mcp_servers.mcp2]
command = "npx"
args = ["-y", "tsx", "server.ts"]
cwd = "/mcp2"

This is MCP1 -- it is started over stdio; I'm just showing that it runs as a local server and works:

(screenshot)

This is MCP2 -- converted from a local HTTP server and it works:

(screenshot)

This is Codex CLI, which doesn't see either of the MCP servers:

(screenshot)
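
For completeness, the same command can be exercised over stdio outside of Codex. A rough smoke-test sketch (command, args and cwd mirror the [mcp_servers.mcp1] entry above; the client name/version and the script itself are illustrative):

import json
import subprocess

# Launch mcp1 exactly as configured for Codex and perform an MCP
# initialize handshake over stdio.
proc = subprocess.Popen(
    ["uv", "run", "mcp", "run", "server.py"],
    cwd="/mcp1",
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "clientInfo": {"name": "smoke-test", "version": "0.0.1"},
    },
}

# The MCP stdio transport is newline-delimited JSON-RPC.
proc.stdin.write(json.dumps(initialize) + "\n")
proc.stdin.flush()

# A healthy server answers with its capabilities and serverInfo.
print(proc.stdout.readline().strip())
proc.terminate()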

Labels

bug (Something isn't working), mcp (Issues related to the use of model context protocol (MCP) servers)
