Connect any LLM-powered client app, such as a coding agent, to any supported inference backend/model.
Topics: proxy, gemini-api, openai-api, llm, claude-ai, claude-api, openrouter, llm-proxy, litellm, deepseek, google-vertex-api, gemini-cli, anthropic-api, llm-agentic-ai, grok-api, qwen-coder, claude-code, qwen3, litellm-ai-gateway, claude-proxy
Updated Sep 12, 2025 - Python