[Feature]: Support Assistant API #2842
Comments
Acknowledging this issue @jeromeroussin, hoping to have a v0 out for testing by end of week.
This would certainly be awesome. Specifically looking for support for:
@krrishdholakia Any updates on this? The OpenAI Assistants API abstracts away the complexity of handling the stateful operations required by LLM-based apps, including persistent threads, messages, and files. This API unlocks automatic RAG pipelines, so developers don't need to build and manage their own vector-store infrastructure. Plus, unifying data and LLMs within a single API is an underrated idea that saves developers time and, ultimately, money.
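The "stateful" part of the comment above (persistent threads holding a growing message history server-side) can be illustrated with a minimal in-memory sketch. The `ThreadStore` class and its method names are purely illustrative and not part of any real API:

```python
import itertools


class ThreadStore:
    """Minimal in-memory stand-in for the persistent thread/message
    state that the Assistants API keeps server-side."""

    def __init__(self):
        self._ids = itertools.count(1)
        self._threads = {}  # thread_id -> list of message dicts

    def create_thread(self):
        thread_id = f"thread_{next(self._ids)}"
        self._threads[thread_id] = []
        return thread_id

    def add_message(self, thread_id, role, content):
        message = {"role": role, "content": content}
        self._threads[thread_id].append(message)
        return message

    def get_messages(self, thread_id):
        # The full history persists between calls, so the client never
        # has to resend prior turns itself -- that is the convenience
        # a stateless chat/completions API does not provide.
        return list(self._threads[thread_id])


store = ThreadStore()
tid = store.create_thread()
store.add_message(tid, "user", "Summarize my uploaded doc.")
store.add_message(tid, "assistant", "Here is a summary...")
print(len(store.get_messages(tid)))  # → 2
```

A proxy layer such as LiteLLM would forward these operations to a backend that actually owns the state, rather than keeping it in memory like this sketch.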
Hey @slavakurilyak @jeromeroussin @taralika, we're hoping to have a v0 out for feedback by end of week.
Hi @taralika @jeromeroussin @slavakurilyak, the PR is now live. Aiming for the SDK support to be live today. Is this something someone can give me feedback on next week? Help would be appreciated. Next steps: adding Azure + proxy support.
Excited to see proxy support: VRSEN/agency-swarm#112
Hi @krrishdholakia, sorry for replying in a closed issue. I noticed that the current implementation of the Assistants API only allows one provider (litellm/litellm/assistants/main.py, line 43 in fda3914). Does this mean that we can only use OpenAI's models, and cannot use other models such as Claude 3, Llama 3, or even Azure OpenAI's models for now? If that's the case, is there any plan to support the Assistants API for various models in the future, like what LiteLLM did for the chat/completion APIs? Thanks!
I'll be adding a provider that supports other models once datastax/astra-assistants-api#22 is complete.
@RussellLuo How would you suggest we support LiteLLM's completion calls within the Assistants API framework? This seems pretty provider-specific. I think the next step would be adding the Azure endpoints.
@krrishdholakia Indeed, supporting the Assistants API requires a complete backend implementation. Looking forward to the support for Azure endpoints! @phact Looks awesome, thanks for the great work!
The Feature
The Assistants API comes with new endpoints that LiteLLM does not currently support.
Docs:
We'd like to be able to proxy calls to the Assistants API with LiteLLM.
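As a rough illustration of what proxying would mean here, the sketch below builds (but does not send) the kind of `POST /v1/threads` request a client would issue against a local proxy and which the proxy would forward upstream. The `localhost:4000` base URL is an assumption for illustration, not a documented LiteLLM default:

```python
import json
import urllib.request

# Assumed local proxy address for illustration only.
PROXY_BASE = "http://localhost:4000"


def build_create_thread_request(api_key: str) -> urllib.request.Request:
    """Build (but do not send) a thread-creation request aimed at the proxy."""
    return urllib.request.Request(
        url=f"{PROXY_BASE}/v1/threads",
        data=json.dumps({}).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_create_thread_request("sk-test")
print(req.full_url)       # → http://localhost:4000/v1/threads
print(req.get_method())   # → POST
```

The point of the feature request is that a client could keep using the standard Assistants-style endpoints and headers while LiteLLM handles routing, auth, and (eventually) provider translation behind them.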
Motivation, pitch
The Assistants API is a quick way to add "chat-with-your-docs" and code-interpreter capabilities to an existing chatbot.
Twitter / LinkedIn details
https://www.linkedin.com/in/jeromeroussin/