@ethanndickson ethanndickson commented Dec 2, 2025

## Problem

When a stream is interrupted (e.g., by a queued message, user cancellation, or starting a new message), the usage data (breakdown by type and context window) resets to 0.

### Root Causes

**1. AI SDK's `totalUsage` returns zeros on abort**

In `cancelStreamSafely()`, we called `getStreamMetadata()`, which awaits the AI SDK's `totalUsage` promise. When a stream is aborted mid-execution, this promise resolves with **zeros** (not `undefined`), so our fallback logic (`totalUsage ?? cumulativeUsage`) never triggered.
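A minimal sketch of why the fallback never fires. The shapes and names here (`Usage`, `totalUsageAfterAbort`, `buggyMetadata`) are hypothetical stand-ins for the real code; the point is that `??` only falls back on `null`/`undefined`, and a zeroed object is neither:

```typescript
// Hypothetical usage shape -- mirrors the PR description, not the actual source.
interface Usage {
  inputTokens: number;
  outputTokens: number;
}

// Simulates the AI SDK's totalUsage promise after an abort: it resolves
// with a zeroed object rather than rejecting or resolving to undefined.
async function totalUsageAfterAbort(): Promise<Usage> {
  return { inputTokens: 0, outputTokens: 0 };
}

async function buggyMetadata(cumulativeUsage: Usage): Promise<Usage> {
  const totalUsage = await totalUsageAfterAbort();
  // `??` only falls back on null/undefined. A zeros object is neither,
  // so cumulativeUsage is never used and the UI shows 0.
  return totalUsage ?? cumulativeUsage;
}

async function fixedMetadata(cumulativeUsage: Usage): Promise<Usage> {
  // On abort, trust the tracked cumulativeUsage directly.
  return cumulativeUsage;
}
```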

**2. `usageStore` cache not invalidated on `stream-start`**

The `MapStore` caches computed usage state. When a new stream starts after an abort:

- `stream-abort` bumps `usageStore` ✓
- `stream-start` only bumped `states`, NOT `usageStore` ✗
- The stale cached value showed `liveUsage` as `undefined`
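The caching pattern can be sketched as a version-bumped store. This is an illustrative reconstruction, not the project's actual `MapStore`: `bump()` invalidates every cached entry, so a missed bump on `stream-start` would serve the stale pre-abort value.

```typescript
// Minimal version-bumped cache (hypothetical; the real MapStore differs).
class MapStore<T> {
  private version = 0;
  private cache = new Map<string, { version: number; value: T }>();

  // Invalidates all cached entries by advancing the version counter.
  bump(): void {
    this.version++;
  }

  // Returns the cached value if it was computed at the current version,
  // otherwise recomputes and caches it.
  get(key: string, compute: () => T): T {
    const hit = this.cache.get(key);
    if (hit && hit.version === this.version) return hit.value;
    const value = compute();
    this.cache.set(key, { version: this.version, value });
    return value;
  }
}
```

Under this model, the fix is simply to call `usageStore.bump()` in the `stream-start` handler (alongside the existing `states` bump) so the next read recomputes `liveUsage` instead of serving the stale entry.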

## Solution

1. **Use tracked `cumulativeUsage` directly** instead of the AI SDK's unreliable `totalUsage` on abort. `cumulativeUsage` is updated on each `finish-step` event (before tool execution), so it holds accurate data even when the stream is interrupted mid-tool-call.

2. **Bump `usageStore` on `stream-start`** to invalidate the cache when a new stream begins.
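The accumulation side of the fix can be sketched as a fold over stream events. The event and usage types here are simplified assumptions; the real AI SDK stream carries richer payloads. The key property is that an abort keeps every step recorded so far and discards only the in-flight step:

```typescript
interface StepUsage {
  inputTokens: number;
  outputTokens: number;
}

// Simplified stream event model (hypothetical).
type StreamEvent =
  | { type: "finish-step"; usage: StepUsage }
  | { type: "abort" };

// Accumulates usage as each step finishes. On abort, everything recorded
// up to that point is preserved; only the partial in-flight step is lost.
function trackCumulativeUsage(events: StreamEvent[]): StepUsage {
  const cumulative: StepUsage = { inputTokens: 0, outputTokens: 0 };
  for (const ev of events) {
    if (ev.type === "abort") break;
    cumulative.inputTokens += ev.usage.inputTokens;
    cumulative.outputTokens += ev.usage.outputTokens;
  }
  return cumulative;
}
```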

### Trade-off

Usage from the **interrupted step** is abandoned. The AI SDK's `finish-step` event fires *before* tool execution begins, so if we abort during tool execution, that step's usage was already recorded. However, if we abort during the model's response generation (before `finish-step`), that partial step's usage is lost. This is acceptable since:

- We can't get reliable data from the SDK for interrupted steps
- Cumulative usage from all *completed* steps is preserved
- The alternative (zeros everywhere) is worse

_Generated with `mux`_

Two bugs caused usage/cost data to reset when a queued message was sent:

1. AI SDK's totalUsage returns zeros on abort - now use tracked cumulativeUsage
2. usageStore cache not invalidated on stream-start - now bump on new stream

Usage from the interrupted step is abandoned since finish-step fires before
tool execution, but cumulative usage from completed steps is preserved.
@ethanndickson ethanndickson added this pull request to the merge queue Dec 2, 2025
github-merge-queue bot pushed a commit that referenced this pull request Dec 2, 2025
@ethanndickson ethanndickson removed this pull request from the merge queue due to a manual request Dec 2, 2025
@ethanndickson ethanndickson changed the title 🤖 fix: preserve usage data when queued message interrupts stream 🤖 fix: preserve usage data when stream is interrupted Dec 2, 2025
@ethanndickson ethanndickson added this pull request to the merge queue Dec 2, 2025
Merged via the queue into main with commit e1117fc Dec 2, 2025
13 checks passed
@ethanndickson ethanndickson deleted the fix-queued-message-usage-reset branch December 2, 2025 06:47