🤖 fix: correct context usage display for multi-step tool calls
The Context Usage UI was showing inflated cachedInputTokens for plan
messages with multi-step tool calls (e.g., ~150k instead of ~50k).
Root cause: the UI fell back to cumulative usage (summed across all
steps) whenever contextUsage was undefined. For multi-step requests,
cachedInputTokens is summed because each step reads from the cache, but
the actual context window only ever holds one step's worth of tokens.
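The inflation can be reproduced with a small sketch. The per-step token figures below are hypothetical, but they show why summing cachedInputTokens across steps (correct for billing) misrepresents the context window, which only ever holds the last step's prompt:

```typescript
// Hypothetical per-step usage for a 3-step tool-call request.
type Usage = { inputTokens: number; cachedInputTokens: number };

const steps: Usage[] = [
  { inputTokens: 48_000, cachedInputTokens: 45_000 },
  { inputTokens: 49_500, cachedInputTokens: 47_000 },
  { inputTokens: 51_000, cachedInputTokens: 49_000 },
];

// Cumulative usage: each step re-reads the cache, so cached tokens sum up.
const cumulative = steps.reduce(
  (acc, s) => ({
    inputTokens: acc.inputTokens + s.inputTokens,
    cachedInputTokens: acc.cachedInputTokens + s.cachedInputTokens,
  }),
  { inputTokens: 0, cachedInputTokens: 0 }
);

// The context window only sees the last step's prompt.
const contextUsage = steps[steps.length - 1];

console.log(cumulative.cachedInputTokens); // 141000 — the inflated number
console.log(contextUsage.inputTokens);     // 51000 — actual context size
```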
Changes:
- Backend: Refactor getStreamMetadata() to fetch totalUsage (for costs)
and contextUsage (last step, for the context window) separately from the AI SDK
- Backend: Add contextProviderMetadata from streamResult.providerMetadata
for accurate cache-creation token display
- Frontend: Remove the fallback from contextUsage to usage; use only
contextUsage for the context window display
The fix ensures the context window shows the last step's inputTokens
(the actual context size) while cost calculation still uses cumulative totals.
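A minimal sketch of the backend refactor, assuming a simplified stream-result shape (the real AI SDK result exposes more fields; steps, totalUsage, and providerMetadata are the only parts modeled here):

```typescript
type Usage = { inputTokens: number; cachedInputTokens: number };

interface StreamResultLike {
  totalUsage: Usage;                        // cumulative across all steps
  steps: { usage: Usage }[];                // per-step usage
  providerMetadata?: Record<string, unknown>;
}

// Fetch cost and context-window figures separately instead of conflating them.
function getStreamMetadata(result: StreamResultLike) {
  const lastStep = result.steps[result.steps.length - 1];
  return {
    usage: result.totalUsage,               // cumulative totals, for costs
    contextUsage: lastStep?.usage,          // last step = actual context window
    contextProviderMetadata: result.providerMetadata, // cache-creation tokens
  };
}
```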