add optional custom_llm_provider param for litellm #221

Merged

Conversation

@entropi (Contributor) commented May 22, 2024

We are using the LiteLLM proxy, but still use magentic's litellm backend rather than the openai backend with a custom base URL, in order to pass through custom metadata.

Adding this pass-through parameter allows one, for example, to set custom_llm_provider to openai in order to use non-OpenAI models such as Claude through the proxy. Without it, when you specify a non-OpenAI model, the litellm backend tends to error on missing API keys.
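
A minimal sketch of the intended usage, assuming the new parameter is exposed as a `custom_llm_provider` keyword on magentic's `LitellmChatModel`; the proxy URL and model name are illustrative placeholders:

```python
from magentic import prompt
from magentic.chat_model.litellm_chat_model import LitellmChatModel

# Route requests through a LiteLLM proxy while telling litellm to treat the
# endpoint as OpenAI-compatible, so non-OpenAI models (e.g. Claude) work
# without provider-specific API keys configured locally.
# Assumptions: api_base points at a local LiteLLM proxy; model name is an example.
model = LitellmChatModel(
    "claude-3-opus-20240229",
    api_base="http://localhost:4000",   # example LiteLLM proxy endpoint
    custom_llm_provider="openai",       # pass-through parameter added by this PR
)

@prompt("Say hello to {name}.", model=model)
def greet(name: str) -> str: ...

print(greet("world"))
```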

@jackmpcollins merged commit 04a4ffc into jackmpcollins:main May 23, 2024
1 check passed
@jackmpcollins (Owner)

Thanks! Will release this shortly
