# codex-proxy

An OpenAI Responses API proxy for Gemini and Z.AI (GLM) providers.
Translates OpenAI's Responses API to Gemini and Z.AI APIs. Handles wire format differences, role mapping, and SSE stream formatting so Codex can use these providers instead of GPT.
## Features

- **Responses API** - Full lifecycle with SSE events
- **Multi-Provider** - Gemini (OAuth2) and Z.AI (GLM) support
- **Context Compaction** - Supported for both Gemini and Z.AI models
- **Tool Support** - Function calling and web search
- **Docker Ready** - Production container with hot-reload
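The SSE event framing mentioned above can be illustrated with a toy parser. This is a sketch of the Responses-API-style wire format (named `event:` lines followed by JSON `data:` payloads), not the proxy's actual implementation; the event names follow OpenAI's Responses API conventions.

```python
import json

def parse_sse(stream: str):
    """Parse a Responses-API-style SSE stream into (event, data) pairs."""
    events = []
    event_name = None
    for line in stream.splitlines():
        if line.startswith("event: "):
            # Remember the event name; it applies to the next data line.
            event_name = line[len("event: "):]
        elif line.startswith("data: "):
            # Decode the JSON payload and pair it with its event name.
            events.append((event_name, json.loads(line[len("data: "):])))
    return events

sample = (
    "event: response.output_text.delta\n"
    'data: {"delta": "Hello"}\n'
    "\n"
    "event: response.completed\n"
    'data: {"response": {"status": "completed"}}\n'
)
print(parse_sse(sample))
```

A real stream interleaves many delta events before the terminal `response.completed` event; the proxy's job is to emit frames in exactly this shape regardless of which upstream provider produced the tokens.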
## Quick Start

```shell
# Clone and start
git clone https://github.com/cornellsh/codex-proxy.git
cd codex-proxy

# Start the proxy (Docker)
./scripts/control.sh start

# Or run directly (Python 3.14+ required)
python -m codex_proxy
```
## Configuration

Configuration lives at `~/.config/codex-proxy/config.json`. Environment variables override all settings.
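The config schema is not shown here; purely as an illustration, a file at that path might look like the following. Every key below (`port`, `provider`, `zai_api_key`) is an assumption for the sketch, not the proxy's documented schema:

```json
{
  "port": 8000,
  "provider": "gemini",
  "zai_api_key": "..."
}
```

Since environment variables take precedence, a variable corresponding to one of these keys (whatever naming scheme the proxy uses) would override the file value at startup.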