
fix(ai): DeepInfra completion tokens calculation for gemini models #12214

Closed

sylviezhang37 wants to merge 3 commits into main from fix/deepinfra-gemini-tokens

Conversation

@sylviezhang37 (Contributor)

Background

We noticed in the ai-gateway logs that output token counts for DeepInfra's Gemini models are negative.

Summary

This is an issue with how DeepInfra converts Gemini responses to the OpenAI-compatible format. Below is a raw output from them in which completion_tokens (84) is less than reasoning_tokens (1081); in the OpenAI-compatible format, completion_tokens should include reasoning_tokens.

  "prompt_tokens": 19,
  "completion_tokens": 84,
  "total_tokens": 1184,
  "prompt_tokens_details": null,
  "completion_tokens_details": {
    "reasoning_tokens": 1081
  }
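Note that the total confirms reasoning tokens were left out of completion_tokens: 19 + 84 + 1081 = 1184. A gateway that assumes OpenAI-compat semantics and derives the visible (non-reasoning) output as completion_tokens minus reasoning_tokens then goes negative. A minimal sketch of that arithmetic, assuming (this is an assumption, not the actual gateway code) that the subtraction is how the gateway computes it:

```typescript
// Raw usage as returned by DeepInfra (from the example above).
const usage = {
  prompt_tokens: 19,
  completion_tokens: 84,
  total_tokens: 1184,
  completion_tokens_details: { reasoning_tokens: 1081 },
};

// Under OpenAI-compat semantics, completion_tokens should already include
// reasoning tokens, so visible output would be completion minus reasoning.
// Hypothetical derivation, illustrating the logged symptom:
const visibleOutput =
  usage.completion_tokens - usage.completion_tokens_details.reasoning_tokens;
console.log(visibleOutput); // 84 - 1081 = -997, a negative output count

// The total shows reasoning tokens were excluded from completion_tokens:
const sumsToTotal =
  usage.prompt_tokens +
    usage.completion_tokens +
    usage.completion_tokens_details.reasoning_tokens ===
  usage.total_tokens;
console.log(sumsToTotal); // true: 19 + 84 + 1081 === 1184
```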

They use vLLM, and the issue appears to be upstream in vLLM's usage calculation, where completion_tokens_details is not accounted for (reasoning tokens are not folded into completion_tokens).

Added conditional logic to override the miscalculated token counts for the models where we are seeing the issue, while keeping all other models unchanged.
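The override might look roughly like the following sketch. All names here (Usage, AFFECTED_MODELS, normalizeCompletionTokens) and the affected-model list are illustrative assumptions, not the actual AI SDK internals:

```typescript
// Shape of the OpenAI-compatible usage object (simplified, assumed).
interface Usage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  completion_tokens_details?: { reasoning_tokens?: number } | null;
}

// Hypothetical list of models observed to under-report completion_tokens.
const AFFECTED_MODELS = new Set([
  'google/gemini-2.5-flash',
  'google/gemini-2.5-pro',
]);

function normalizeCompletionTokens(model: string, usage: Usage): number {
  const reasoning = usage.completion_tokens_details?.reasoning_tokens ?? 0;
  // For affected models, DeepInfra reports completion_tokens without
  // reasoning tokens; add them back so downstream "completion - reasoning"
  // math cannot go negative. All other models pass through unchanged.
  if (AFFECTED_MODELS.has(model) && reasoning > usage.completion_tokens) {
    return usage.completion_tokens + reasoning;
  }
  return usage.completion_tokens;
}
```

With the example usage above, an affected model would yield 84 + 1081 = 1165 completion tokens, which is consistent with total_tokens (19 + 1165 = 1184).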

Manual Verification

  • Verified with streaming and non-streaming tests on both Gemini models on DeepInfra
  • Verified with additional unit tests

Checklist

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • I have reviewed this pull request (self-review)

Future Work

An upstream PR is open on vLLM related to reasoning-token calculation, but it is progressing slowly; I have subscribed to it. If vLLM incorporates reasoning tokens into completion_tokens, the code changes here can be reverted.

@vercel-ai-sdk bot added labels ai/provider (related to a provider package; must be assigned together with at least one provider/* label), bug (something isn't working as documented), and provider/black-forest-labs (issues related to the @ai-sdk/black-forest-labs provider, https://bfl.ai) on Feb 3, 2026
@sylviezhang37 (Contributor, Author) commented Feb 3, 2026

Closing this and passing the issue to the AI SDK team.

