[Platform][LiteLLM] Add Bridge #872
Conversation
Force-pushed from cdb9a18 to 0f4f0e0
Hey, thanks for that, that's actually an interesting one. The point of LiteLLM is to be a generic bridge to whatever backend, based on the original OpenAI contract, right? So I think we should work on the design shift here a bit more, in case you're up for that. For example, the contract is already centralized, and as long as the bridge uses that original OpenAI-compliant contract, no additional normalizers need to be registered with the platform. I think the same should apply to some kind of generic endpoint handling (model client + result converter). So it would be great if we could use generic services (like LiteLLM, OpenRouter, Replicate) that conform to that first OpenAI contract, from both a payload and an endpoint-handling point of view, without even installing a bridge. WDYT? (If that's off-scope for your current context, we can go ahead with this, merge it for now, and potentially remove it later in favor of a generalized/centralized solution.)
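The design idea above can be sketched roughly as follows (in Python, not the platform's actual PHP code; the base URLs, model name, and helper name are illustrative assumptions): any OpenAI-compatible service accepts the same payload shape at the same relative endpoint, so one generic request builder covers LiteLLM, OpenRouter, and similar proxies without a per-service bridge.

```python
# Illustrative sketch: one OpenAI-style chat-completions request builder
# works against any OpenAI-compatible base URL (no per-service bridge).
def build_chat_request(base_url: str, model: str, messages: list) -> dict:
    """Build a generic OpenAI-compatible chat completion request.

    base_url and model are caller-supplied assumptions; only the
    payload shape and the relative endpoint path are fixed by the
    OpenAI chat contract.
    """
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "body": {"model": model, "messages": messages},
    }


# The same builder targets different OpenAI-compatible services
# purely by swapping the base URL (URLs here are examples only):
for base in ("http://localhost:4000/v1", "https://openrouter.ai/api/v1"):
    req = build_chat_request(
        base, "gpt-4o-mini", [{"role": "user", "content": "Hi"}]
    )
    print(req["url"])
```

The normalizer/converter registration the platform would otherwise need per bridge collapses to this single shared contract, which is the centralization the comment argues for.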
Thank you @welcoMattic.
This PR adds support for LiteLLM as a Bridge.
For now, it is very simple support (chat only); see it as a ready-to-evolve bridge.
I've added a litellm service to the Docker Compose file in the examples, so it can be tested easily without installing LiteLLM locally on your host.
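For context, a minimal `litellm` service entry in a Compose file might look roughly like this. This is a sketch under assumptions, not the PR's actual file: the image tag, config path, and port are taken from LiteLLM's common defaults and may differ from what the PR ships.

```yaml
# Illustrative sketch only (not the PR's actual compose file).
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest  # assumed tag
    ports:
      - "4000:4000"  # LiteLLM proxy's usual default port
    volumes:
      - ./litellm-config.yaml:/app/config.yaml  # hypothetical config mount
    command: ["--config", "/app/config.yaml"]
```

With something like this, the examples can point their OpenAI-compatible client at `http://localhost:4000` instead of requiring a host-level LiteLLM install.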