Add OpenLM LLM multi-provider #4993
Conversation
thanks @r2d4! could we add a simple example notebook (a la docs/modules/models/llms/integrations/openai.ipynb)
@@ -90,7 +90,7 @@
 pandas = {version = "^2.0.1", optional = true}
 telethon = {version = "^1.28.5", optional = true}
 zep-python = {version="^0.25", optional=true}
 chardet = {version="^5.1.0", optional=true}
+openlm = {version = "^0.0.5", optional = true}
if this isn't added to one of the extras it'll be installed automatically. could we either delete it or add it to "all" (line 186)?
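The reviewer's suggestion would amount to listing the new package in the aggregate extras group in pyproject.toml. A minimal sketch of what that looks like is below; the other entries and the exact name of the extras group are abbreviated here and may differ from the repository's actual file:

```toml
# Hypothetical excerpt of pyproject.toml (entries abbreviated).
# Packages listed under an extras group are only installed when that
# extra is requested, e.g. `pip install "langchain[all]"`.
[tool.poetry.extras]
all = ["openai", "openlm", "anthropic", "huggingface_hub"]
```

Poetry treats an `optional = true` dependency that belongs to no extras group as a regular dependency check failure or, depending on configuration, installs it unconditionally, which is why the reviewer asks for it to be added to "all" or removed.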
OpenLM is a zero-dependency OpenAI-compatible LLM provider that can call different inference endpoints directly via HTTP. It implements the OpenAI Completion class so that it can be used as a drop-in replacement for the OpenAI API. This changeset utilizes BaseOpenAI for minimal added code.
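To illustrate the multi-provider idea described above (this is not OpenLM's actual implementation; the endpoint URLs, model prefixes, and function name are hypothetical), a provider can be chosen by inspecting the requested model name and dispatching to the matching HTTP completion endpoint:

```python
# Hypothetical sketch of multi-provider routing by model name.
# The prefixes and URLs below are illustrative, not OpenLM's real tables.
PROVIDER_ENDPOINTS = {
    "text-davinci": "https://api.openai.com/v1/completions",
    "claude": "https://api.anthropic.com/v1/complete",
    "flan-t5": "https://api-inference.huggingface.co/models",
}

def resolve_endpoint(model: str) -> str:
    """Return the inference endpoint whose prefix matches the model name."""
    for prefix, url in PROVIDER_ENDPOINTS.items():
        if model.startswith(prefix):
            return url
    raise ValueError(f"No provider registered for model {model!r}")
```

Because every provider is exposed through the same OpenAI-style Completion interface, code written against the OpenAI API can switch providers by changing only the model name.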
Thanks for the feedback @dev2049. Let me know if there are any other changes!