fix(vertex_ai): Handle missing tokenCount in promptTokensDetails #11581
Conversation
This PR fixes the error `Error converting to a valid response block='tokenCount'. File an issue if litellm error - https://github.com/BerriAI/litellm/issues`. It happens because vertex_ai does not always send the token count for the audio modality.
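For illustration, the failing payload looks roughly like this (a sketch: the field names follow the Vertex AI/Gemini usageMetadata format, but the exact values are assumed):

```python
# Illustrative (assumed) usageMetadata from a Vertex AI response: the AUDIO
# entry in promptTokensDetails carries no tokenCount, so the previous code's
# detail["tokenCount"] lookup raised KeyError.
usage_metadata = {
    "promptTokenCount": 57,
    "promptTokensDetails": [
        {"modality": "TEXT", "tokenCount": 57},
        {"modality": "AUDIO"},  # tokenCount missing
    ],
}
```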
Please can we add a test in tests/test_litellm
```diff
@@ -967,17 +967,17 @@ def _calculate_usage(
         response_tokens_details = CompletionTokensDetailsWrapper()
         for detail in usage_metadata["responseTokensDetails"]:
             if detail["modality"] == "TEXT":
-                response_tokens_details.text_tokens = detail["tokenCount"]
+                response_tokens_details.text_tokens = detail.get("tokenCount", 0)
```
can you please add a unit test for this in tests_litellm/
@KingNish24
I will add the test
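As a sketch of what that test could look like (the payload shape is assumed from the error report; the real test in tests/litellm/ would feed this payload through litellm's _calculate_usage rather than reimplementing the loop):

```python
# Hypothetical test sketch: exercises the .get("tokenCount", 0) pattern
# against a Vertex AI-style usageMetadata payload whose AUDIO entry
# lacks tokenCount.
def test_vertex_ai_usage_metadata_missing_token_count():
    usage_metadata = {
        "promptTokenCount": 57,
        "promptTokensDetails": [
            {"modality": "TEXT", "tokenCount": 57},
            {"modality": "AUDIO"},  # tokenCount missing, as Vertex AI sometimes returns
        ],
    }

    text_tokens = 0
    audio_tokens = 0
    for detail in usage_metadata["promptTokensDetails"]:
        if detail["modality"] == "TEXT":
            text_tokens = detail.get("tokenCount", 0)  # defaults to 0 when absent
        elif detail["modality"] == "AUDIO":
            audio_tokens = detail.get("tokenCount", 0)

    assert text_tokens == 57
    assert audio_tokens == 0  # no KeyError; missing count falls back to 0
```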
Merged commit cd028de into BerriAI:litellm_fix_missing_token_count_vertex
fix(vertex_ai): Handle missing tokenCount in promptTokensDetails (#11581) (#11896)

* fix(vertex_ai): Handle missing tokenCount in promptTokensDetails (#11581). This fixes the error `Error converting to a valid response block='tokenCount'. File an issue if litellm error - https://github.com/BerriAI/litellm/issues`, which occurs because vertex_ai does not always send the token count for the audio modality.

* test_vertex_ai_usage_metadata_missing_token_count

Co-authored-by: Nishith Jain <167524748+KingNish24@users.noreply.github.com>
Title
This PR fixes the error `Error converting to a valid response block='tokenCount'. File an issue if litellm error - https://github.com/BerriAI/litellm/issues`.
It happens because vertex_ai does not always send the token count for the audio modality.
Relevant issues
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details)
- My PR passes all unit tests on make test-unit
Type
🐛 Bug Fix
Changes
- Replaced direct dictionary access `detail["tokenCount"]` with the safer `detail.get("tokenCount", 0)`
- Added a default value of 0 when the "tokenCount" key is missing
- This prevents KeyError exceptions when processing audio tokens where the "tokenCount" field might be missing
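A minimal before/after demonstration of the access pattern, assuming a detail entry without the key:

```python
detail = {"modality": "AUDIO"}  # "tokenCount" absent

# Before the fix: direct indexing raises KeyError on the missing key.
# detail["tokenCount"]  # -> KeyError: 'tokenCount'

# After the fix: .get() falls back to the default of 0.
assert detail.get("tokenCount", 0) == 0
```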