Clarifai-LiteLLM : Added clarifai as LLM Provider. #3369
Conversation
* intg v1 clarifai-litellm
* Added more community models and testcase
* Clarifai-updated markdown docs
Thanks for this PR @mogith-pn. Happy to add support. Blockers:
Once completed, if you can share a screenshot of the tests passing for you - that would be great!
Hi @krrishdholakia ,
For async completions - it's just a call with our async http handler; see anthropic - litellm/litellm/llms/anthropic.py Line 258 in 3d92876
For streaming - if your backend server doesn't support streaming, make a normal completion/async completion call and wrap it in an iterator - litellm/litellm/llms/anthropic.py Line 211 in 3d92876
This will make sure people's calls don't break in prod.
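The suggestion above - faking a stream when the backend only supports full completions - can be sketched roughly like this (a hedged illustration, not litellm's actual implementation; `fake_stream` and its chunking are made up for the example):

```python
# Sketch: if the provider has no streaming endpoint, make one full
# completion call and wrap the result in an iterator, so callers that
# requested stream=True still receive an iterable of chunks.

def fake_stream(full_text: str, chunk_size: int = 20):
    """Yield the already-completed response in small chunks, mimicking a stream."""
    for i in range(0, len(full_text), chunk_size):
        yield full_text[i:i + chunk_size]

# Usage: the provider integration would return this iterator when
# stream=True; joining the chunks recovers the original completion.
chunks = list(fake_stream("The quick brown fox jumps over the lazy dog."))
assert "".join(chunks) == "The quick brown fox jumps over the lazy dog."
```

This is what keeps prod calls from breaking: clients written against a streaming interface keep working even though the provider answered in one shot.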
I don't think your PR uses the clarifai sdk. If I missed it - can you please use the HTTP endpoints and our httpx clients instead - e.g. litellm/litellm/llms/anthropic.py Line 258 in 3d92876
This will keep the package light, and let people switch between providers easily.
Hi @krrishdholakia ,
Hey @krrishdholakia ,
Great! Planning on merging after we have a stable release out later today 🚀 Curious - how're you using litellm today?