
[Feature]: Support Assistant API #2842

Closed
jeromeroussin opened this issue Apr 4, 2024 · 10 comments · Fixed by #3455
Labels
enhancement New feature or request

Comments

@jeromeroussin

The Feature

The Assistants API comes with new endpoints that LiteLLM does not currently support.
Docs:

We'd like to be able to proxy calls to the Assistants API with LiteLLM.

Motivation, pitch

The Assistants API is a quick way to add "chat-with-your-docs" and Code Interpreter capabilities to an existing chatbot.

Twitter / LinkedIn details

https://www.linkedin.com/in/jeromeroussin/

@jeromeroussin jeromeroussin added the enhancement New feature or request label Apr 4, 2024
@krrishdholakia
Contributor

acknowledging this issue @jeromeroussin

hoping to have a v0 out for testing by end of week

@taralika
Contributor

This would certainly be awesome; specifically, I'm looking for support for:

  • /openai/assistants
  • /openai/threads
  • /openai/threads/${thread_id}/messages
  • /openai/threads/${thread_id}/runs
  • /openai/threads/${thread_id}/runs/${run_id}

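The routes above are simple path templates over a thread and run ID. A minimal sketch of route builders for those paths — the helper names here are illustrative, not part of LiteLLM's actual API:

```python
from typing import Optional

# Hypothetical route builders for the Assistants endpoints listed above.
# The function names are illustrative; only the path shapes come from the list.

ASSISTANTS_ROUTE = "/openai/assistants"
THREADS_ROUTE = "/openai/threads"


def thread_messages(thread_id: str) -> str:
    """Path for listing or creating messages on a thread."""
    return f"{THREADS_ROUTE}/{thread_id}/messages"


def thread_runs(thread_id: str, run_id: Optional[str] = None) -> str:
    """Path for a thread's runs, or a specific run when run_id is given."""
    base = f"{THREADS_ROUTE}/{thread_id}/runs"
    return f"{base}/{run_id}" if run_id else base
```

A proxy would match incoming requests against these shapes and forward them upstream with the caller's credentials swapped for the configured provider key.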
@slavakurilyak

@krrishdholakia Any updates on this?

The OpenAI Assistants API abstracts away the complexity of handling the stateful operations required for LLM-based apps, including persistent threads, messages, and files.

This API unlocks automatic RAG pipelines, so developers don't need to build and manage their own vector-store infrastructure.

Plus, unifying data and LLMs within a single API is an underrated idea that saves developers time and, ultimately, money.

@krrishdholakia
Contributor

hey @slavakurilyak @jeromeroussin @taralika we're hoping to have a v0 out for feedback by end of week.

@krrishdholakia
Contributor

Hi @taralika @jeromeroussin @slavakurilyak, the PR is now live. Aiming for the SDK support to be live today.

Is this something someone can give me feedback on next week? Help would be appreciated.

Next steps: adding Azure + proxy support

@slavakurilyak

Excited to see proxy support VRSEN/agency-swarm#112

@RussellLuo

Hi @krrishdholakia, sorry for replying in a closed issue.

I noticed that the current implementation of the Assistants API only allows one provider, openai:

if custom_llm_provider == "openai":

Does this mean that we can only use OpenAI's models, and cannot use other models such as Claude 3, Llama 3, or even Azure OpenAI's models for now?
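The single-provider gate quoted above could, in principle, grow into a provider dispatch as more backends are added. A hypothetical sketch of that pattern — the names and registry here are illustrative, not LiteLLM internals:

```python
# Hypothetical provider dispatch for Assistants calls.
# Per the quoted code, "openai" is the only provider supported today;
# the registry and function names below are illustrative only.

SUPPORTED_ASSISTANT_PROVIDERS = {"openai"}


def route_assistants_call(custom_llm_provider: str) -> str:
    """Return the provider to dispatch to, or raise for unsupported ones."""
    if custom_llm_provider not in SUPPORTED_ASSISTANT_PROVIDERS:
        raise ValueError(
            f"Assistants API not supported for provider: {custom_llm_provider}"
        )
    return custom_llm_provider
```

Adding a backend (e.g. Azure) would then mean registering it in the set and implementing its thread/message/run handlers.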

If that's the case, is there any plan to support Assistants API for various models - like what LiteLLM did for chat/completion APIs - in the future?

Thanks!

@phact
Contributor

phact commented May 24, 2024

I'll be adding a provider that supports other models once datastax/astra-assistants-api#22 is complete.

@krrishdholakia
Contributor

@RussellLuo How would you suggest we support LiteLLM's completion calls within the Assistants API framework? This seems pretty provider-specific.

I think the next step would be adding the Azure endpoints.

@RussellLuo

@krrishdholakia Indeed, supporting the Assistants API requires a complete backend implementation. Looking forward to the support for Azure endpoints!

@phact Looks awesome, thanks for the great work!
