Add support for Azure, OpenAI, Palm, Anthropic, Cohere, Replicate Models - using litellm #84
base: main
Conversation
@ShishirPatil can you please take a look at this? Happy to add docs/tests if this initial commit looks good 😊
```python
    model=model,
    messages=question,
    n=1,
    temperature=0,
)
response = responses['choices'][0]['message']['content']
elif "claude" in model:
```
litellm manages model I/O for your model choice
Hello @ishaan-jaff, thank you for your PR! It's impressive to see a wrapper that can accommodate multiple models. However, I'm a bit concerned about introducing an additional dependency that might not be essential for standard use cases. While I see its potential value, I'm contemplating whether it's best to include it in a separate release or integrate it into the main releases. I'm flexible on this. Could you initiate a discussion thread titled "Request for Comments - LiteLLM Integration" so we can gather community feedback? Based on that I can review this appropriately. Though at a high level, it looks good, since you basically count on the user to create the
Hey, I would like to contribute to this PR...
Thanks for the feedback @ShishirPatil
e.g. only install litellm
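One common way to keep such a dependency optional, so standard installs are unaffected, is a guarded import. A minimal sketch (the flag name `HAS_LITELLM` is illustrative, not from the PR):

```python
# Sketch: treat litellm as an optional extra; standard installs that never
# set a non-OpenAI model name don't need it.
try:
    from litellm import completion  # only present if the user installed litellm
    HAS_LITELLM = True
except ImportError:
    HAS_LITELLM = False
```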
Any suggestions on where I should start, and what is open to implement?
@ShishirPatil @rajveer43 happy to help as well :)
This PR adds support for models from all the above mentioned providers using litellm https://github.com/BerriAI/litellm
TLDR: gorilla gets:
- support for models from all the above mentioned providers through a single `model` param (easy to add models in the future, just change the `model` param)

Here's a sample of how it's used: