feat(router.py): support fastest response batch completion call #3887

Merged
merged 8 commits into main from litellm_batch_completions on May 29, 2024

Conversation

@krrishdholakia (Contributor) commented May 29, 2024

Title

Support a fastest-response batch completion call: the router sends the same request to multiple deployments in parallel, returns the fastest response, and cancels the others.

[Screenshot: fastest-response batch completion demo, 2024-05-28 9:52 PM]

Relevant issues

A user wants to stream OpenAI + Groq completions with tool calling and return whichever response arrives first to their users.

Type

🆕 New Feature

Changes

  • adds a new `abatch_completion_fastest_response` function to the router (usage sketch below)
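A minimal usage sketch of the new method. The `Router` setup follows litellm's standard `model_list` config; the exact call shape (comma-separated model groups, standard `messages`) is an assumption based on this PR's description, not confirmed API docs.

```python
import asyncio
from litellm import Router

# Two deployments to race against each other; models and keys are placeholders.
router = Router(
    model_list=[
        {
            "model_name": "gpt-4",
            "litellm_params": {"model": "gpt-4", "api_key": "sk-..."},
        },
        {
            "model_name": "groq-llama",
            "litellm_params": {"model": "groq/llama3-8b-8192", "api_key": "gsk_..."},
        },
    ]
)

async def main():
    # Assumed call shape: race the listed model groups, return the first
    # response to finish, and cancel the rest.
    response = await router.abatch_completion_fastest_response(
        model="gpt-4, groq-llama",
        messages=[{"role": "user", "content": "What's the weather in SF?"}],
    )
    print(response)

asyncio.run(main())
```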

[REQUIRED] Testing - Attach a screenshot of any new tests passing locally

If UI changes, send a screenshot/GIF of working UI fixes

[Screenshot: new tests passing locally, 2024-05-28 7:51 PM]

returns fastest response. cancels others.
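For context, the "return fastest, cancel others" behaviour this commit describes maps naturally onto `asyncio.wait` with `FIRST_COMPLETED`. A generic sketch of that pattern, not the PR's actual implementation:

```python
import asyncio

async def fastest_of(coros):
    """Race coroutines: return the first result, cancel the rest."""
    tasks = [asyncio.create_task(c) for c in coros]
    # Wait until any one task finishes.
    done, pending = await asyncio.wait(tasks, return_when=asyncio.FIRST_COMPLETED)
    for task in pending:
        task.cancel()  # stop the slower in-flight calls
    # .result() re-raises if the winning task failed.
    return done.pop().result()
```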

… on proxy

introduces new `fastest_response` flag for enabling the call
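A hedged sketch of calling the proxy with the new flag through the OpenAI SDK. The `fastest_response` flag name comes from this PR; passing it via `extra_body` and the comma-separated `model` string are assumptions about the request shape.

```python
import openai

# Point the OpenAI client at a locally running litellm proxy (URL/key are placeholders).
client = openai.OpenAI(api_key="sk-1234", base_url="http://0.0.0.0:4000")

response = client.chat.completions.create(
    model="gpt-4, groq-llama",  # model groups to race, comma-separated (assumed)
    messages=[{"role": "user", "content": "Hello"}],
    extra_body={"fastest_response": True},  # enable the race on the proxy side (assumed)
)
print(response)
```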
@krrishdholakia krrishdholakia merged commit 0114207 into main May 29, 2024
2 of 5 checks passed
@krrishdholakia krrishdholakia deleted the litellm_batch_completions branch May 29, 2024 05:38