A modern web-based configuration interface for the Claude Code Proxy server with database-driven configuration management.
A Next.js application with a modern web interface for managing your Claude Code Proxy server. Store configurations in a database, manage API keys securely, and control the proxy server directly from your browser. Built with React, TypeScript, Prisma, and SQLite for a complete configuration management solution.
- Modern Web Interface: Built with Next.js and React for easy configuration management
- Database Storage: Configuration settings stored in a SQLite database with Prisma ORM
- Dynamic Configuration: Change settings without restarting the proxy server
- API Key Management: Securely store and manage API keys for different providers
- Model Mapping: Configure how Claude models map to other provider models
- Server Control: Start/stop the proxy server directly from the web interface
- Type Safety: Full TypeScript support for a better development experience
- API Translation: Converts between Anthropic API format and OpenAI/Gemini formats
- Streaming Support: Full support for Server-Sent Events (SSE) streaming responses
- Function Calling: Complete support for tool/function calling
- Multiple Providers: Support for OpenAI, Gemini, and Anthropic backends
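To illustrate the API translation feature, here is a minimal sketch of the kind of Anthropic-to-OpenAI request conversion the proxy performs. The function and field handling below are illustrative assumptions, not the actual exports of `src/services/converter.js`:

```javascript
// Sketch of Anthropic -> OpenAI chat request translation.
// Names here (anthropicToOpenAI, targetModel) are illustrative only.
function anthropicToOpenAI(anthropicReq, targetModel) {
  const messages = [];

  // Anthropic carries the system prompt as a top-level field;
  // OpenAI expects it as the first chat message.
  if (anthropicReq.system) {
    messages.push({ role: 'system', content: anthropicReq.system });
  }

  for (const msg of anthropicReq.messages) {
    // Anthropic content may be a plain string or an array of content blocks.
    const text = Array.isArray(msg.content)
      ? msg.content.filter((b) => b.type === 'text').map((b) => b.text).join('\n')
      : msg.content;
    messages.push({ role: msg.role, content: text });
  }

  return {
    model: targetModel,
    messages,
    max_tokens: anthropicReq.max_tokens,
    stream: anthropicReq.stream ?? false,
  };
}

const out = anthropicToOpenAI(
  {
    system: 'You are helpful.',
    messages: [{ role: 'user', content: [{ type: 'text', text: 'Hi' }] }],
    max_tokens: 256,
  },
  'gpt-4.1'
);
console.log(out.messages.length); // 2 (system + user)
```

The response path performs the inverse mapping, turning OpenAI completions back into Anthropic message objects.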
- Node.js 18+
- npm or yarn package manager
```bash
# Install dependencies
npm install

# Generate Prisma client and create database
npx prisma generate
npx prisma db push

# Start the development server
npm run dev
```

The web interface will be available at http://localhost:3000
- Open http://localhost:3000 in your browser
- Navigate to the "API Keys" tab
- Add your API keys for the providers you want to use:
- OpenAI: Add your OpenAI API key
- Google: Add your Google AI API key for Gemini
- Anthropic: Add your Anthropic API key for Claude
- Go to the "Model Mappings" tab
- Configure how Claude models should map to other provider models
- Choose the target provider and model for each Claude model
- Click the "Start Proxy" button in the web interface
- The proxy server will start on port 8082 (configurable)
- You can now use it with Claude Code:
```bash
ANTHROPIC_BASE_URL=http://localhost:8082 claude
```

| Variable | Description | Default | Required |
|---|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | - | If using OpenAI |
| `GEMINI_API_KEY` | Google AI Studio API key | - | If using Gemini |
| `ANTHROPIC_API_KEY` | Anthropic API key | - | If using Anthropic |
| `PREFERRED_PROVIDER` | Primary provider (`openai`, `google`, `anthropic`) | `openai` | No |
| `BIG_MODEL` | Model for Sonnet mapping | `gpt-4.1` | No |
| `SMALL_MODEL` | Model for Haiku mapping | `gpt-4.1-mini` | No |
| `OPENAI_BASE_URL` | Custom OpenAI endpoint | - | No |
| `PORT` | Server port | `8082` | No |
| `HOST` | Server host | `0.0.0.0` | No |
| `LOG_LEVEL` | Logging level | `warn` | No |
The proxy automatically maps Claude models based on your configuration:
| Claude Model | Default Mapping | When using Gemini provider |
|---|---|---|
| `claude-3-haiku` | `openai/gpt-4.1-mini` | `gemini/[SMALL_MODEL]` |
| `claude-3-sonnet` | `openai/gpt-4.1` | `gemini/[BIG_MODEL]` |
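The table above reduces to a simple substitution rule. A sketch of that rule follows; the real logic lives in `src/services/modelMapper.js`, and the function and option names here are assumptions:

```javascript
// Illustrative sketch of the Claude model mapping rule, not the
// actual src/services/modelMapper.js implementation.
function mapModel(
  claudeModel,
  { provider = 'openai', bigModel = 'gpt-4.1', smallModel = 'gpt-4.1-mini' } = {}
) {
  if (claudeModel.includes('haiku')) return `${provider}/${smallModel}`;
  if (claudeModel.includes('sonnet')) return `${provider}/${bigModel}`;
  // Unrecognized Claude models pass through unchanged.
  return claudeModel;
}

console.log(mapModel('claude-3-sonnet')); // openai/gpt-4.1
console.log(
  mapModel('claude-3-haiku', { provider: 'gemini', smallModel: 'gemini-2.5-flash' })
); // gemini/gemini-2.5-flash
```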
- o3-mini, o1, o1-mini, o1-pro
- gpt-4.5-preview, gpt-4o, gpt-4o-mini
- chatgpt-4o-latest, gpt-4o-audio-preview
- gpt-4.1, gpt-4.1-mini
- gemini-2.5-flash
- gemini-2.5-pro
- Create a `.env` file with your configuration
- Start the service:

```bash
docker-compose up -d
```
```bash
# Build the image
docker build -t claude-proxy-nodejs .

# Run the container
docker run -d \
  --name claude-proxy \
  --env-file .env \
  -p 8082:8082 \
  claude-proxy-nodejs
```

- `POST /v1/messages` - Create a message (supports streaming)
- `POST /v1/messages/count_tokens` - Count tokens in a request
- `GET /` - Health check and service info
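For a quick smoke test of the `/v1/messages` endpoint, a request body in Anthropic's message format can be sent with Node 18's built-in `fetch`. The proxy address assumes the default port above:

```javascript
// Example request body for POST /v1/messages (Anthropic message format).
const body = {
  model: 'claude-3-sonnet',
  max_tokens: 100,
  messages: [{ role: 'user', content: 'Hello!' }],
};

// Against a running proxy (assumed at localhost:8082) you would send:
async function sendMessage() {
  const res = await fetch('http://localhost:8082/v1/messages', {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify(body),
  });
  return res.json();
}

// sendMessage().then(console.log); // uncomment with the proxy running
```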
```
nodejs-version/
├── src/
│   ├── config/              # Configuration management
│   ├── middleware/          # Express middleware
│   ├── models/              # Zod schemas for validation
│   ├── routes/              # API route handlers
│   ├── services/            # Core business logic
│   │   ├── converter.js     # Format conversion
│   │   ├── litellmClient.js # API client
│   │   ├── modelMapper.js   # Model mapping
│   │   └── streamHandler.js # Streaming support
│   ├── utils/               # Utilities (logging, etc.)
│   └── server.js            # Main application entry
├── package.json
├── Dockerfile
└── docker-compose.yml
```
```bash
# Install dependencies
npm install

# Start in development mode (with auto-reload)
npm run dev

# Run linting
npm run lint

# Format code
npm run format
```

- Receives requests in Anthropic's API format
- Validates and maps models according to configuration
- Converts the request format to the target provider (OpenAI/Gemini/Anthropic)
- Makes API calls to the appropriate backend service
- Converts responses back to Anthropic format
- Returns the formatted response to the client
The proxy handles both streaming and non-streaming responses, maintains full compatibility with Claude clients, and provides comprehensive error handling and logging.
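For the streaming path, responses are re-emitted as Anthropic-style Server-Sent Events. The helper below is a minimal sketch of that emission pattern against an Express-style response object; the event names follow Anthropic's streaming format, but the helper itself is illustrative, not the actual `streamHandler.js` code:

```javascript
// Illustrative SSE emitter: each event is an "event:" line followed by
// a JSON "data:" line and a blank line, per the SSE wire format.
function sendEvent(res, event, data) {
  res.write(`event: ${event}\n`);
  res.write(`data: ${JSON.stringify(data)}\n\n`);
}

// Demonstrate with a fake response object that collects written chunks.
const chunks = [];
const fakeRes = { write: (s) => chunks.push(s) };

sendEvent(fakeRes, 'message_stop', { type: 'message_stop' });
console.log(chunks[0]); // event: message_stop

// In a real route handler you would first set the stream headers:
// res.setHeader('Content-Type', 'text/event-stream');
// then emit message_start, content_block_delta, ... , message_stop.
```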
- Comprehensive error logging with Winston
- Graceful degradation for network issues
- Proper HTTP status codes and error messages
- Request validation with detailed error responses
Beautiful, colorized request logging showing:
- Model mapping (`claude-3-sonnet` → `gpt-4.1`)
- Request details (method, endpoint, status)
- Message and tool counts
- Response times
Contributions are welcome! Please feel free to submit a Pull Request.
MIT License - see the original project for details.
Note: This is a Node.js implementation of the original Python Claude Code Proxy. Both versions provide the same functionality with language-specific optimizations.