Use your ChatGPT Plus or Pro subscription inside Claude Code, Codex, and any MCP client.
openai-mcp exposes 15 MCP tools that forward requests directly to ChatGPT's backend API. No proxy process. No separate account. Your token, your quota.
Works with Claude Code, Codex CLI, and any client that speaks the MCP protocol over stdio.
```sh
pip install openai-mcp
```

Or with pipx for an isolated install:

```sh
pipx install openai-mcp
```

Then run the setup wizard:

```sh
openai-mcp setup
```

The wizard will:
- Look for an existing Codex auth token at `~/.codex/auth.json`
- If not found, prompt you to paste a ChatGPT session token
- Save credentials to `~/.openai-mcp/token.json`
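The credential lookup described above can be sketched in a few lines. The two paths come from the setup steps; treating `~/.codex/auth.json` as first in the search order is an assumption about the wizard's behavior, not something the project documents explicitly.

```python
from pathlib import Path

# Credential locations used by openai-mcp (from the setup steps above).
# The lookup order is an assumption: Codex token first, then the
# wizard-saved token.
CANDIDATES = [
    Path.home() / ".codex" / "auth.json",
    Path.home() / ".openai-mcp" / "token.json",
]

def find_credentials():
    """Return the first credential file that exists, or None."""
    for path in CANDIDATES:
        if path.is_file():
            return path
    return None

print(find_credentials() or "no credentials found -- run `openai-mcp setup`")
```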
Add the following to `~/.claude.json` (under `mcpServers`):

```json
{
  "mcpServers": {
    "openai": {
      "type": "stdio",
      "command": "openai-mcp",
      "args": ["run", "--stdio"]
    }
  }
}
```

Restart Claude Code after saving. Tools appear under the `openai` namespace.
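Under the hood, any MCP client talks to the server by writing newline-delimited JSON-RPC 2.0 messages to its stdin. A minimal sketch of the first two requests a client sends, assuming the standard MCP handshake (the `protocolVersion` string and `clientInfo` values here are illustrative):

```python
import json

def rpc(method: str, params: dict, msg_id: int) -> str:
    """Serialize one JSON-RPC 2.0 request as a newline-terminated line,
    the framing MCP uses over stdio."""
    return json.dumps({"jsonrpc": "2.0", "id": msg_id,
                       "method": method, "params": params}) + "\n"

# Handshake, then ask the server to enumerate its tools.
init = rpc("initialize", {"protocolVersion": "2024-11-05",
                          "capabilities": {},
                          "clientInfo": {"name": "demo", "version": "0.1"}}, 1)
list_tools = rpc("tools/list", {}, 2)
print(init, list_tools, sep="")
```

Claude Code performs this exchange for you; the sketch only shows what crosses the pipe.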
| Tool | What it does |
|---|---|
| `chat` | Chat with GPT-5.x, Pro models, o3, o3-pro |
| `deep_research` | Web-augmented search answer (~30 s) |
| `deep_research_heavy` | Long-form Deep Research via gpt-5-5-pro (5–30 min, uses monthly quota) |
| `account_status` | ChatGPT plan and enabled features |
| `list_models` | All models available to your account |
| `memory_list` | List ChatGPT memories (PII redacted) |
| `memory_search` | Search ChatGPT memories by keyword |
| `custom_instructions_get` | Retrieve your ChatGPT custom instructions |
| `list_codex_envs` | List Codex environments |
| `list_codex_tasks` | List recent Codex tasks |
| `list_custom_gpts` | List your custom GPTs |
| `list_conversations` | Recent ChatGPT conversations |
| `list_tasks` | Scheduled ChatGPT tasks |
| `list_apps` | Connected apps and connectors |
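Invoking one of these tools is a `tools/call` JSON-RPC request. A hedged example for the `chat` tool follows; the argument names (`prompt`, `model`) are illustrative assumptions, since the table above does not give each tool's exact schema:

```python
import json

# Hypothetical tools/call request for the `chat` tool. The "arguments"
# keys here are assumed for illustration, not taken from openai-mcp docs.
call = {
    "jsonrpc": "2.0",
    "id": 3,
    "method": "tools/call",
    "params": {
        "name": "chat",
        "arguments": {"prompt": "Summarize MCP in one line",
                      "model": "o3"},
    },
}
print(json.dumps(call))
```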
Native Python implementation, no proxy. The server calls `/backend-api/conversation` (SSE) directly, using `curl_cffi` for TLS impersonation. Vendored POW and Turnstile solvers handle the OpenAI Sentinel challenge. See NOTICES for attribution.
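The SSE stream returned by `/backend-api/conversation` is a sequence of `data:` events ending in a `[DONE]` sentinel. A minimal parsing sketch, with an invented payload shape (the real ChatGPT event schema is richer and not shown here):

```python
import json

# Illustrative SSE stream; the {"message": {"content": ...}} shape is an
# assumption for this sketch, not the real ChatGPT backend schema.
SAMPLE = (
    'data: {"message": {"content": "Hel"}}\n\n'
    'data: {"message": {"content": "lo"}}\n\n'
    "data: [DONE]\n\n"
)

def sse_events(stream: str):
    """Yield the JSON payload of each SSE data event until [DONE]."""
    for line in stream.splitlines():
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            return
        yield json.loads(payload)

chunks = [e["message"]["content"] for e in sse_events(SAMPLE)]
print("".join(chunks))  # → Hello
```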
```
~/.codex/auth.json (or ~/.openai-mcp/token.json)
        |
openai-mcp (stdio MCP server)
        |
curl_cffi → chatgpt.com /backend-api/conversation (SSE)
        |
14 read tools + 1 heavy DR tool
```
- Deep Research quota: 248 requests/month on Pro; lower on Plus.
- `image_gen`: stub is present but not yet wired to a working endpoint.
- `memory_add`: memories are read-only; the write endpoint returns 405, so the tool is not registered.
- Requires an active ChatGPT Plus or Pro subscription.
```sh
python -m venv .venv
source .venv/bin/activate
pip install -e ".[dev]"
pytest
```

MIT. See NOTICES for third-party attributions.
- lanqian528/chat2api — POW and Turnstile solver code (MIT)
- basketikun/chatgpt2api — survey of ChatGPT backend API patterns
- 7836246/cursor2api — survey of Cursor API patterns