
fix(vertex_ai): Handle missing tokenCount in promptTokensDetails #11581


Conversation

KingNish24 (Contributor)

Title

This PR fixes the error: `Error converting to a valid response block='tokenCount'. File an issue if litellm error - https://github.com/BerriAI/litellm/issues`

It happens because vertex_ai does not always send a token count for the audio modality.

Relevant issues

Pre-Submission checklist

Please complete all items before asking a LiteLLM maintainer to review your PR

  • I have added testing in the tests/litellm/ directory (adding at least 1 test is a hard requirement - see details)
  • I have added a screenshot of my new test passing locally
  • My PR passes all unit tests on make test-unit
  • My PR's scope is as isolated as possible, it only solves 1 specific problem

Type

🐛 Bug Fix

Changes

  • Replaced direct dictionary access detail["tokenCount"] with the safer .get("tokenCount", 0) method
  • Added a default value of 0 when the "tokenCount" key is missing
  • This prevents KeyError exceptions when processing audio tokens where the "tokenCount" field might be missing
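The defensive pattern above can be sketched as a standalone snippet. This is a minimal illustration, not litellm's actual `_calculate_usage`; the `sum_token_counts` helper and the sample payload shape are assumptions modeled on Vertex AI's `responseTokensDetails` format:

```python
# Minimal sketch of the defensive-access pattern (illustrative helper,
# not litellm's real code path).

def sum_token_counts(response_tokens_details):
    """Sum per-modality token counts, tolerating a missing 'tokenCount'.

    Vertex AI sometimes omits 'tokenCount' for a modality (e.g. AUDIO),
    so detail["tokenCount"] would raise KeyError; .get(..., 0) does not.
    """
    totals = {}
    for detail in response_tokens_details:
        modality = detail.get("modality", "UNKNOWN")
        # The fix: default to 0 when the key is absent.
        totals[modality] = totals.get(modality, 0) + detail.get("tokenCount", 0)
    return totals


details = [
    {"modality": "TEXT", "tokenCount": 74},
    {"modality": "AUDIO"},  # tokenCount missing, as Vertex AI may send
]
print(sum_token_counts(details))  # {'TEXT': 74, 'AUDIO': 0}
```

With direct indexing, the second entry would raise `KeyError: 'tokenCount'`; with `.get("tokenCount", 0)` the audio modality simply contributes zero tokens.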



@ishaan-jaff (Contributor) left a comment


Please can we add a test in tests/test_litellm

```diff
@@ -967,17 +967,17 @@ def _calculate_usage(
         response_tokens_details = CompletionTokensDetailsWrapper()
         for detail in usage_metadata["responseTokensDetails"]:
             if detail["modality"] == "TEXT":
-                response_tokens_details.text_tokens = detail["tokenCount"]
+                response_tokens_details.text_tokens = detail.get("tokenCount", 0)
```

can you please add a unit test for this in tests_litellm/ @KingNish24

@ishaan-jaff (Contributor)

I will add the test
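The merged test (test_vertex_ai_usage_metadata_missing_token_count, per the commit log below) is not shown in this conversation. A regression test along these lines could cover the case; this is a sketch only, and `extract_token_counts` is a hypothetical stand-in for the patched code path, not litellm's real function:

```python
# Sketch of a regression test for the missing-tokenCount case.
# extract_token_counts is a hypothetical stand-in for the patched
# aggregation logic in litellm's Vertex AI transformation.

def extract_token_counts(usage_metadata):
    counts = {"text": 0, "audio": 0}
    for detail in usage_metadata.get("responseTokensDetails", []):
        if detail.get("modality") == "TEXT":
            counts["text"] = detail.get("tokenCount", 0)
        elif detail.get("modality") == "AUDIO":
            # Before the fix, detail["tokenCount"] raised KeyError
            # when Vertex AI omitted the field for the audio modality.
            counts["audio"] = detail.get("tokenCount", 0)
    return counts


def test_usage_metadata_missing_token_count():
    usage_metadata = {
        "responseTokensDetails": [
            {"modality": "TEXT", "tokenCount": 12},
            {"modality": "AUDIO"},  # no tokenCount key
        ]
    }
    assert extract_token_counts(usage_metadata) == {"text": 12, "audio": 0}


test_usage_metadata_missing_token_count()
```

The key assertion is that an entry without "tokenCount" is treated as zero tokens rather than raising KeyError.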

@ishaan-jaff ishaan-jaff changed the base branch from main to litellm_fix_missing_token_count_vertex June 19, 2025 20:34
@ishaan-jaff ishaan-jaff merged commit cd028de into BerriAI:litellm_fix_missing_token_count_vertex Jun 19, 2025
5 of 6 checks passed
ishaan-jaff added a commit that referenced this pull request Jun 19, 2025
#11896)

* fix(vertex_ai): Handle missing tokenCount in promptTokensDetails (#11581)


* test_vertex_ai_usage_metadata_missing_token_count

---------

Co-authored-by: Nishith Jain <167524748+KingNish24@users.noreply.github.com>
satendrakumar pushed a commit to satendrakumar/litellm that referenced this pull request Jul 24, 2025
…riAI#11… (BerriAI#11896)

4 participants