Intelligent multi-provider LLM routing library for Go applications
A high-performance Go library that routes requests across multiple LLM providers (OpenAI, Anthropic, Gemini, DeepSeek, Groq) with intelligent fallback, caching, and cost optimization.
Install:

```bash
go get github.com/Egham-7/adaptive-proxy
```

Then create a minimal proxy:

```go
package main

import (
	"log"
	"os"

	"github.com/Egham-7/adaptive-proxy/pkg/config"
)

func main() {
	// Configure the proxy with OpenAI primary and Anthropic fallback
	builder := config.New().
		Port("8080").
		AddOpenAICompatibleProvider("openai",
			config.NewProviderBuilder(os.Getenv("OPENAI_API_KEY")).Build(),
		).
		AddAnthropicCompatibleProvider("anthropic",
			config.NewProviderBuilder(os.Getenv("ANTHROPIC_API_KEY")).Build(),
		)

	// Start the server
	srv := config.NewProxyWithBuilder(builder)
	if err := srv.Run(); err != nil {
		log.Fatal(err)
	}
}
```

Your proxy is now running on http://localhost:8080 with automatic fallback!
- Zero Vendor Lock-in - Switch providers without code changes
- Multi-Provider Support - OpenAI, Anthropic, Groq, DeepSeek, Gemini
- Production-Ready - Redis caching, rate limiting, graceful shutdown
- Type-Safe - Fluent builder API with full Go type safety
Full documentation available in docs/
- Quick Start - Get running in 5 minutes
- Installation - Installation and setup
- Basic Usage - Core concepts and patterns
- Providers - Configure OpenAI, Anthropic, Gemini, and custom providers
- Caching - Redis-backed prompt caching and semantic search
- Routing - Intelligent model selection and cost optimization
- Fallback - Multi-provider fallback strategies
- Middleware - Rate limiting, timeouts, and custom middleware
Once your proxy is running, make requests using the OpenAI-compatible API:

```bash
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

You can also control routing, caching, and fallback per request with an extended body:

```json
{
  "model": "",
  "messages": [{ "role": "user", "content": "Complex analysis task" }],
  "model_router": {
    "cost_bias": 0.3,
    "models": [
      { "provider": "openai" },
      { "provider": "anthropic", "model_name": "claude-3-5-sonnet-20241022" }
    ]
  },
  "cache": {
    "enabled": true,
    "semantic_threshold": 0.85
  },
  "fallback": {
    "mode": "sequential"
  }
}
```

Additional endpoints:

- `GET /v1/models` - List available models
- `GET /health` - Health check
- `POST /v1/messages` - Anthropic-compatible messages endpoint
```bash
git clone https://github.com/Egham-7/adaptive-proxy.git
cd adaptive-proxy
cp .env.example .env.local  # Add your API keys
go run cmd/api/main.go
```

Run all pre-commit checks:

```bash
gofmt -w . && golangci-lint run && go mod tidy && go vet ./... && go test ./...
```

Install golangci-lint if needed:

```bash
go install github.com/golangci/golangci-lint/cmd/golangci-lint@latest
```

See CLAUDE.md for detailed architecture documentation and AGENTS.md for AI agent development guidelines.
Contributions welcome! Please ensure all pre-commit checks pass:

```bash
gofmt -w . && golangci-lint run && go mod tidy && go vet ./... && go test ./...
```

See LICENSE for details.
- Documentation: docs/
- Issues: GitHub Issues
- Discussions: GitHub Discussions