Claude Code Proxy - Next.js Edition πŸš€

A modern web-based configuration interface for the Claude Code Proxy server with database-driven configuration management. 🌐

A Next.js application for managing your Claude Code Proxy server: store configuration in a database, manage API keys securely, and control the proxy directly from your browser. Built with React, TypeScript, Prisma, and SQLite for a complete configuration management solution.
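
As a rough sketch of what database-driven configuration looks like in practice, settings can be read through the Prisma client whenever the proxy needs them, so changes made in the web UI take effect without a restart (the configSetting model and its fields below are hypothetical, not the repository's actual schema):

import { PrismaClient } from '@prisma/client'

const prisma = new PrismaClient()

// Read all settings from SQLite into a key/value map.
// A hypothetical `configSetting` model with `key` and `value` columns is assumed.
export async function loadConfig(): Promise<Record<string, string>> {
  const rows = await prisma.configSetting.findMany()
  return Object.fromEntries(rows.map((row) => [row.key, row.value]))
}

// Example: resolve the proxy port with a fallback.
export async function getProxyPort(): Promise<number> {
  const config = await loadConfig()
  return Number(config.PORT ?? 8082)
}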


Features ✨

  • 🌐 Modern Web Interface: Built with Next.js and React for easy configuration management
  • πŸ—„οΈ Database Storage: Configuration settings stored in SQLite database with Prisma ORM
  • πŸ”„ Dynamic Configuration: Change settings without restarting the proxy server
  • πŸ”‘ API Key Management: Securely store and manage API keys for different providers
  • 🎯 Model Mapping: Configure how Claude models map to other provider models
  • ▢️ Server Control: Start/stop proxy server directly from the web interface
  • πŸ›‘οΈ Type Safety: Full TypeScript support for better development experience
  • API Translation: Converts between Anthropic API format and OpenAI/Gemini formats
  • Streaming Support: Full support for Server-Sent Events (SSE) streaming responses
  • Function Calling: Complete support for tool/function calling
  • Multiple Providers: Support for OpenAI, Gemini, and Anthropic backends

Quick Start ⚑

Prerequisites

  • Node.js 18+
  • npm or yarn package manager

1. Install Dependencies

npm install

2. Set up Database

# Generate Prisma client and create database
npx prisma generate
npx prisma db push

3. Start Development Server

npm run dev

The web interface will be available at http://localhost:3000

4. Configure API Keys

  1. Open http://localhost:3000 in your browser
  2. Navigate to the "API Keys" tab
  3. Add your API keys for the providers you want to use:
    • OpenAI: Add your OpenAI API key
    • Google: Add your Google AI API key for Gemini
    • Anthropic: Add your Anthropic API key for Claude

5. Configure Model Mappings

  1. Go to the "Model Mappings" tab
  2. Configure how Claude models should map to other provider models
  3. Choose the target provider and model for each Claude model

6. Start Proxy Server

  1. Click the "Start Proxy" button in the web interface
  2. The proxy server will start on port 8082 (configurable)
  3. You can now use it with Claude Code:
ANTHROPIC_BASE_URL=http://localhost:8082 claude

Configuration πŸ”§

Environment Variables

Variable              Description                                     Default        Required
OPENAI_API_KEY        OpenAI API key                                  -              If using OpenAI
GEMINI_API_KEY        Google AI Studio API key                        -              If using Gemini
ANTHROPIC_API_KEY     Anthropic API key                               -              If using Anthropic
PREFERRED_PROVIDER    Primary provider (openai, google, anthropic)    openai         No
BIG_MODEL             Model for Sonnet mapping                        gpt-4.1        No
SMALL_MODEL           Model for Haiku mapping                         gpt-4.1-mini   No
OPENAI_BASE_URL       Custom OpenAI endpoint                          -              No
PORT                  Server port                                     8082           No
HOST                  Server host                                     0.0.0.0        No
LOG_LEVEL             Logging level                                   warn           No
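
For reference, a minimal .env using the variables above might look like this (the key values are placeholders):

OPENAI_API_KEY=sk-your-openai-key
GEMINI_API_KEY=your-google-ai-key
PREFERRED_PROVIDER=openai
BIG_MODEL=gpt-4.1
SMALL_MODEL=gpt-4.1-mini
PORT=8082
LOG_LEVEL=warn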

Model Mapping πŸ—ΊοΈ

The proxy automatically maps Claude models based on your configuration:

Claude Model       Default Mapping        When using Gemini provider
claude-3-haiku     openai/gpt-4.1-mini    gemini/[SMALL_MODEL]
claude-3-sonnet    openai/gpt-4.1         gemini/[BIG_MODEL]
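
Conceptually the mapping is a small lookup keyed on the incoming Claude model name and the preferred provider; a simplified sketch (not the actual modelMapper.js implementation) looks like this:

// Simplified sketch of the mapping rule in the table above.
// bigModel / smallModel and the provider come from configuration.
type Provider = 'openai' | 'google'

function mapClaudeModel(
  claudeModel: string,
  provider: Provider,
  bigModel = 'gpt-4.1',
  smallModel = 'gpt-4.1-mini'
): string {
  const prefix = provider === 'google' ? 'gemini' : 'openai'
  if (claudeModel.includes('haiku')) return `${prefix}/${smallModel}`
  if (claudeModel.includes('sonnet')) return `${prefix}/${bigModel}`
  return claudeModel // unknown models pass through unchanged
}

// mapClaudeModel('claude-3-sonnet', 'openai')                                      -> 'openai/gpt-4.1'
// mapClaudeModel('claude-3-haiku', 'google', 'gemini-2.5-pro', 'gemini-2.5-flash') -> 'gemini/gemini-2.5-flash'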

Supported Models

OpenAI Models

  • o3-mini, o1, o1-mini, o1-pro
  • gpt-4.5-preview, gpt-4o, gpt-4o-mini
  • chatgpt-4o-latest, gpt-4o-audio-preview
  • gpt-4.1, gpt-4.1-mini

Gemini Models

  • gemini-2.5-flash
  • gemini-2.5-pro

Docker Deployment 🐳

Using Docker Compose (Recommended)

  1. Create .env file with your configuration
  2. Start the service:
    docker-compose up -d
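
The compose file in the repository defines this service; a minimal equivalent, shown here only as an illustration, looks roughly like:

services:
  claude-proxy:
    build: .
    env_file: .env
    ports:
      - "8082:8082"
    restart: unless-stopped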

Using Docker directly

# Build the image
docker build -t claude-proxy-nodejs .

# Run the container
docker run -d \
  --name claude-proxy \
  --env-file .env \
  -p 8082:8082 \
  claude-proxy-nodejs

API Endpoints πŸ“‘

Messages

  • POST /v1/messages - Create a message (supports streaming)
  • POST /v1/messages/count_tokens - Count tokens in a request
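
As a quick smoke test against a running proxy, you can send a request in Anthropic's Messages format directly (a sketch; whether the x-api-key and anthropic-version headers are required depends on the proxy configuration):

// Node.js 18+ (global fetch). The API key is a placeholder; the proxy manages
// the real provider keys itself.
const response = await fetch('http://localhost:8082/v1/messages', {
  method: 'POST',
  headers: {
    'content-type': 'application/json',
    'x-api-key': 'placeholder',
    'anthropic-version': '2023-06-01',
  },
  body: JSON.stringify({
    model: 'claude-3-sonnet',
    max_tokens: 256,
    messages: [{ role: 'user', content: 'Say hello in one sentence.' }],
  }),
})

console.log(await response.json())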

Health Check

  • GET / - Health check and service info

Architecture πŸ—οΈ

nodejs-version/
β”œβ”€β”€ src/
β”‚   β”œβ”€β”€ config/           # Configuration management
β”‚   β”œβ”€β”€ middleware/       # Express middleware
β”‚   β”œβ”€β”€ models/          # Zod schemas for validation
β”‚   β”œβ”€β”€ routes/          # API route handlers
β”‚   β”œβ”€β”€ services/        # Core business logic
β”‚   β”‚   β”œβ”€β”€ converter.js    # Format conversion
β”‚   β”‚   β”œβ”€β”€ litellmClient.js # API client
β”‚   β”‚   β”œβ”€β”€ modelMapper.js  # Model mapping
β”‚   β”‚   └── streamHandler.js # Streaming support
β”‚   β”œβ”€β”€ utils/           # Utilities (logging, etc.)
β”‚   └── server.js        # Main application entry
β”œβ”€β”€ package.json
β”œβ”€β”€ Dockerfile
└── docker-compose.yml

Development πŸ‘¨β€πŸ’»

# Install dependencies
npm install

# Start in development mode (with auto-reload)
npm run dev

# Run linting
npm run lint

# Format code
npm run format

How It Works 🧩

  1. Receives requests in Anthropic's API format πŸ“₯
  2. Validates and maps models according to configuration πŸ”„
  3. Converts request format to target provider (OpenAI/Gemini/Anthropic) πŸ“€
  4. Makes API calls to the appropriate backend service 🌐
  5. Converts responses back to Anthropic format πŸ”„
  6. Returns formatted response to client βœ…

The proxy handles both streaming and non-streaming responses, maintains full compatibility with Claude clients, and provides comprehensive error handling and logging.
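
A much-simplified sketch of step 3 for the OpenAI direction (illustrative only, not the actual converter.js code):

// Translate an Anthropic Messages request into an OpenAI chat-completions request.
// The real converter also handles tool definitions, multi-part content blocks,
// and streaming.
interface AnthropicRequest {
  model: string
  max_tokens: number
  system?: string
  messages: { role: 'user' | 'assistant'; content: string }[]
}

function toOpenAIRequest(req: AnthropicRequest, mappedModel: string) {
  return {
    model: mappedModel,
    max_tokens: req.max_tokens,
    messages: [
      ...(req.system ? [{ role: 'system' as const, content: req.system }] : []),
      ...req.messages,
    ],
  }
}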

Error Handling πŸ›‘οΈ

  • Comprehensive error logging with Winston
  • Graceful degradation for network issues
  • Proper HTTP status codes and error messages
  • Request validation with detailed error responses
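
For instance, an Express error-handling middleware along the following lines can turn failures into Anthropic-style error bodies (a sketch assuming clients expect Anthropic's error envelope; not the repository's actual middleware):

import { Request, Response, NextFunction } from 'express'
import winston from 'winston'

const logger = winston.createLogger({ transports: [new winston.transports.Console()] })

// Final error handler: log the failure and reply with an Anthropic-style
// error envelope plus a matching HTTP status code.
export function errorHandler(err: any, req: Request, res: Response, _next: NextFunction) {
  const status = typeof err.status === 'number' ? err.status : 500
  logger.error('request failed', { path: req.path, status, message: err.message })
  res.status(status).json({
    type: 'error',
    error: {
      type: status === 400 ? 'invalid_request_error' : 'api_error',
      message: err.message ?? 'Internal server error',
    },
  })
}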

Logging πŸ“Š

Beautiful, colorized request logging showing:

  • Model mapping (claude-3-sonnet β†’ gpt-4.1)
  • Request details (method, endpoint, status)
  • Message and tool counts
  • Response times

Contributing 🀝

Contributions are welcome! Please feel free to submit a Pull Request.

License πŸ“„

MIT License - see the original project for details.


Note: This is a Node.js implementation of the original Python Claude Code Proxy. Both versions provide the same functionality with language-specific optimizations.
