# llm-proxy

Here are 7 public repositories matching this topic.

A personal LLM gateway with fault-tolerant calls to models from any provider that exposes an OpenAI-compatible API. Advanced features such as retries, model sequencing, and request-body parameter injection are also available. Especially useful with AI coding assistants like Cline and RooCode and providers like OpenRouter.

  • Updated Jun 3, 2025
  • Python
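The retry and model-sequencing behavior described above can be sketched as follows. This is a minimal illustration, not the repository's actual code: the function names (`inject_params`, `call_with_fallback`) and the injectable `send` callable are assumptions chosen so the fallback logic stays independent of any particular HTTP client or provider endpoint.

```python
import time
from typing import Any, Callable

def inject_params(body: dict, overrides: dict) -> dict:
    # Hypothetical helper: return a copy of the request body with
    # extra parameters (e.g. the target model name) injected.
    merged = dict(body)
    merged.update(overrides)
    return merged

def call_with_fallback(
    models: list,
    send: Callable[[dict], Any],
    body: dict,
    retries: int = 2,
    backoff: float = 0.5,
):
    # Try each model in sequence; retry transient failures with
    # exponential backoff before falling back to the next model.
    last_error = None
    for model in models:
        request = inject_params(body, {"model": model})
        for attempt in range(retries + 1):
            try:
                return send(request)
            except Exception as exc:
                last_error = exc
                time.sleep(backoff * (2 ** attempt))
    raise RuntimeError(f"all models failed: {last_error!r}")
```

In a real gateway, `send` would POST the body to an OpenAI-compatible endpoint such as `/v1/chat/completions`; keeping it as a parameter makes the fallback policy testable without network access.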
