
Add support for Azure, OpenAI, Palm, Anthropic, Cohere, Replicate Models - using litellm #84

Open
ishaan-jaff wants to merge 1 commit into main
Conversation


@ishaan-jaff ishaan-jaff commented Aug 6, 2023

This PR adds support for models from all of the above-mentioned providers via litellm: https://github.com/BerriAI/litellm

TL;DR: Gorilla gets support for all of these providers through one completion interface.

Here's a sample of how it's used:

```python
import os
from litellm import completion

## set ENV variables
# ENV variables can be set in a .env file, too; see .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"

messages = [{"content": "Hello, how are you?", "role": "user"}]

# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)

# cohere call
response = completion(model="command-nightly", messages=messages)

# anthropic call
response = completion(model="claude-instant-1", messages=messages)
```
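Whichever provider handles the call, litellm normalizes the result to the OpenAI-style response shape, so downstream code can read it one way. A minimal sketch (the dict below is a mock, not a real litellm response):

```python
# Mock of the OpenAI-format completion response that litellm normalizes
# every provider to (contents here are illustrative, not real output).
mock_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "I'm doing well, thanks!"}}
    ]
}

def extract_text(response: dict) -> str:
    """Pull the assistant text out of an OpenAI-format completion response."""
    return response["choices"][0]["message"]["content"]

print(extract_text(mock_response))  # -> I'm doing well, thanks!
```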

@ishaan-jaff (Author) commented Aug 6, 2023

@ShishirPatil can you please take a look at this? Happy to add docs/tests if this initial commit looks good 😊

```python
    model=model,
    messages=question,
    n=1,
    temperature=0,
)
response = responses['choices'][0]['message']['content']
elif "claude" in model:
```
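The excerpt above shows the per-provider `if`/`elif` dispatch this PR targets. A hypothetical sketch of that pattern is below; `call_openai` and `call_anthropic` are stand-ins for the provider SDK calls (not real functions from the repo), and under this PR a single `litellm.completion()` call would replace the whole dispatch:

```python
# Stand-in for the OpenAI SDK call (illustrative only).
def call_openai(model, messages):
    return {"choices": [{"message": {"content": f"openai:{model}"}}]}

# Stand-in for the Anthropic SDK call (illustrative only).
def call_anthropic(model, messages):
    return {"choices": [{"message": {"content": f"anthropic:{model}"}}]}

def completion_dispatch(model, messages):
    """Route by model name, as the if/elif chain in the diff does."""
    if "claude" in model:
        return call_anthropic(model, messages)
    return call_openai(model, messages)

resp = completion_dispatch("claude-instant-1", [{"role": "user", "content": "hi"}])
print(resp["choices"][0]["message"]["content"])  # -> anthropic:claude-instant-1
```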
@ishaan-jaff (Author) commented on the diff above:

litellm manages model I/O for your model choice

@ShishirPatil (Owner)

Hello @ishaan-jaff, thank you for your PR! It's impressive to see a wrapper that can accommodate multiple models. However, I'm a bit concerned about introducing an additional dependency that might not be essential for standard use cases. While I see its potential value, I'm debating whether it's best to include it in a separate release or integrate it into the main releases; I'm flexible on this. Could you open a discussion thread titled "Request for Comments - LiteLLM Integration" so we can gather community feedback? Based on that, I can review this appropriately. At a high level it looks good, since you essentially rely on the user to construct the message.
Thoughts?

@rajveer43 (Contributor)

Hey, I would like to contribute to this PR...

@ishaan-jaff (Author)

Thanks for the feedback @ShishirPatil

e.g., only install the `anthropic` dependency if a user is trying to use claude-2
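That lazy-dependency idea could look something like the sketch below. This is a hypothetical illustration of the comment, not code from litellm or this PR, and the model-to-package mapping is an assumption:

```python
import importlib

def provider_package(model: str) -> str:
    """Map a model name to the SDK package it needs (illustrative mapping)."""
    return "anthropic" if "claude" in model else "openai"

def require_provider(model: str):
    """Import the provider SDK only when the chosen model actually needs it."""
    pkg = provider_package(model)
    try:
        return importlib.import_module(pkg)
    except ImportError as err:
        raise ImportError(
            f"Model {model!r} needs the {pkg!r} package: pip install {pkg}"
        ) from err

print(provider_package("claude-2"))  # -> anthropic
```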

@rajveer43 (Contributor)

Any suggestions on where I should start, and what is open to implement?

@krrishdholakia

@ShishirPatil @rajveer43 happy to help as well :)

4 participants