[Feat] Return response_id == upstream response ID for VertexAI + Google AI studio (Stream+Non stream) #11456


Merged

Merged 5 commits into main on Jun 6, 2025

Conversation

ishaan-jaff
Contributor

@ishaan-jaff ishaan-jaff commented Jun 5, 2025


Returns the upstream response ID from Vertex AI and Google AI Studio for both streaming and non-streaming responses.

Key changes

  • Stream and non-stream response IDs from LiteLLM will now match the upstream API response IDs.
  • We already follow this pattern for OpenAI and Azure, and are working towards this standard for all providers.
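A minimal sketch of the behavior described above, with assumed names (`set_upstream_response_id`, `FakeModelResponse`, and the `responseId` key are illustrative stand-ins, not LiteLLM's actual internals): the upstream provider's response ID, when present, is copied onto the response object so that `response.id` matches the provider's ID instead of a locally generated one.

```python
# Sketch: prefer the upstream provider's response ID over a locally
# generated one. All names here are hypothetical, for illustration only.

def set_upstream_response_id(model_response, completion_response: dict):
    """Assign the upstream response ID, if present, onto the model response."""
    upstream_id = completion_response.get("responseId")
    if upstream_id is not None:
        model_response.id = upstream_id
    return model_response


class FakeModelResponse:
    """Stand-in for LiteLLM's ModelResponse, with a locally generated id."""

    def __init__(self):
        self.id = "chatcmpl-local-generated"


resp = set_upstream_response_id(FakeModelResponse(), {"responseId": "abc-123"})
print(resp.id)  # abc-123
```

If the upstream payload carries no ID, the locally generated ID is kept, so callers always see a non-empty `response.id`.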

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory. Adding at least 1 test is a hard requirement - see details
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests when run with make test-unit
  • My PR's scope is as isolated as possible; it solves only 1 specific problem

Type

🆕 New Feature
✅ Test

Changes


@ishaan-jaff ishaan-jaff requested a review from Copilot June 5, 2025 22:45
Contributor

@Copilot Copilot AI left a comment


Pull Request Overview

This PR implements a new feature that returns the upstream response ID from Vertex AI and Google AI Studio for both streaming and non-streaming responses.

  • A new test has been added to verify that the response ID is preserved.
  • The transformation logic in the Vertex AI Gemini module has been updated to extract and assign the response ID from the API responses.
  • Minor formatting adjustments have been made in supporting files, including the streaming handler.

Reviewed Changes

Copilot reviewed 3 out of 3 changed files in this pull request and generated no comments.

File: Description
  • tests/test_litellm/llms/vertex_ai/gemini/test_vertex_and_google_ai_studio_gemini.py: Adds a test verifying proper response ID assignment for non-streaming responses
  • litellm/llms/vertex_ai/gemini/vertex_and_google_ai_studio_gemini.py: Updates response transformation to extract and assign the upstream response ID
  • litellm/litellm_core_utils/streaming_handler.py: Adjusts formatting and removes a fallback assignment for the response ID
Comments suppressed due to low confidence (1)

litellm/litellm_core_utils/streaming_handler.py:621

  • Consider verifying that the removal of the fallback assignment for self.response_id is intentional, ensuring that the response ID is properly propagated in all relevant scenarios.
else:
    self.response_id = model_response.id  # type: ignore
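To make the streaming concern concrete, here is a hedged sketch (the class and dict shapes are assumed, not LiteLLM's actual streaming handler) of how a handler can keep one consistent response ID across all streamed chunks: the first chunk's upstream ID is captured once and stamped onto every subsequent chunk.

```python
# Sketch: propagate a single upstream response ID across streamed chunks.
# Names and chunk shape are hypothetical, for illustration only.

class StreamingHandler:
    def __init__(self):
        self.response_id = None

    def process_chunk(self, chunk: dict) -> dict:
        if self.response_id is None:
            # Capture the upstream ID from the first chunk only.
            self.response_id = chunk["id"]
        # Stamp every chunk with the same ID for consistency.
        chunk["id"] = self.response_id
        return chunk


h = StreamingHandler()
first = h.process_chunk({"id": "resp-xyz", "text": "Hel"})
second = h.process_chunk({"id": "ignored", "text": "lo"})
print(first["id"], second["id"])  # resp-xyz resp-xyz
```

Removing a fallback assignment in such a handler is safe only if every code path that creates chunks already carries the upstream ID, which is presumably what the Copilot comment asks the author to verify.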

@ishaan-jaff ishaan-jaff merged commit f0cb80e into main Jun 6, 2025
42 of 46 checks passed
stefan-- pushed a commit to stefan--/litellm that referenced this pull request Jun 12, 2025
…le AI studio (Stream+Non stream) (BerriAI#11456)

* fix: vertexAI return responseID

* fix: vertexAI return responseID

* test_vertex_ai_response_id

* test: test_vertex_ai_streaming_response_id

* test_vertex_ai_streaming_response_id
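The commit messages above mention tests named test_vertex_ai_response_id and test_vertex_ai_streaming_response_id. A hedged sketch of what such a test could look like (the `transform_response` helper is a stand-in, not LiteLLM's actual API):

```python
# Sketch of a test asserting that the transformed response keeps the
# upstream ID. The transform function here is illustrative only.

def transform_response(raw: dict) -> dict:
    """Map a raw provider payload to a response dict, preserving responseId."""
    return {"id": raw.get("responseId", "generated-id"), "text": raw["text"]}


def test_vertex_ai_response_id():
    raw = {"responseId": "vertex-123", "text": "hi"}
    assert transform_response(raw)["id"] == "vertex-123"


test_vertex_ai_response_id()
print("ok")
```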