feat: expose current context tokens in session data#4

Closed
AaronGoldsmith wants to merge 3 commits into block:main from AaronGoldsmith:fix/token-display-context-vs-accumulated

Conversation

@AaronGoldsmith

Summary

  • The children query now fetches total_tokens (current context window size) alongside accumulated_total_tokens (per-session cumulative API usage)
  • New fields available in the SSE payload: tokens_current per child session, stats.tokens.current for the aggregate
  • No UI changes — the dashboard continues displaying accumulated_total_tokens via the existing tokens field

Context

The goose sessions DB tracks two token metrics per session:

  • total_tokens — current context window size. Resets after compaction.
  • accumulated_total_tokens — running sum of tokens across all API calls within the session. Only grows.

Previously, the server queried only accumulated_total_tokens for child sessions. This change adds total_tokens to the query so both values are available to the frontend. The semantics are confirmed in block/goose reply_parts.rs: total_tokens is set from the latest API response, while accumulated_total_tokens uses an additive accumulate() helper.
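
The query change can be sketched as follows. This is a hypothetical illustration only: the table layout and column set are inferred from the PR description, not taken from the real goose schema.

```python
import sqlite3

# Hypothetical sketch of the children query change: fetch both token
# columns. Table and column names follow the PR description, not the
# actual goose sessions DB schema.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE sessions (
        id TEXT PRIMARY KEY,
        parent_id TEXT,
        total_tokens INTEGER,              -- current context window size
        accumulated_total_tokens INTEGER   -- cumulative API usage, only grows
    )
""")
conn.execute(
    "INSERT INTO sessions VALUES (?, ?, ?, ?)",
    ("child-1", "root", 10_500, 1_200_000),
)

def fetch_children(parent_id):
    """Return both token metrics per child session."""
    rows = conn.execute(
        "SELECT id, total_tokens, accumulated_total_tokens "
        "FROM sessions WHERE parent_id = ?",
        (parent_id,),
    ).fetchall()
    # tokens_current is the new field; tokens keeps the existing meaning.
    return [{"id": r[0], "tokens_current": r[1], "tokens": r[2]} for r in rows]

print(fetch_children("root"))
```

With this shape, the existing UI can keep reading `tokens` unchanged while new consumers pick up `tokens_current`.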

Test plan

  • Verified delegate cards still display accumulated tokens (no UI regression)
  • Verified tokens_current field is populated from DB (checked against sqlite3 query of real sessions)
  • Verified clockworks status bar still shows accumulated_total as before

🤖 Generated with Claude Code

AaronGoldsmith and others added 2 commits March 2, 2026 14:57
The UI was displaying accumulated_total_tokens (sum of input tokens
across every API call) instead of total_tokens (current context window
size). This made sessions appear to use 10-100x more tokens than they
actually had in context — e.g. showing 1M+ tokens for a session whose
context was only ~10K.

Now the per-session cards show current context size, and the clockworks
status bar shows both: "34.5K ctx / 390K tot" so you can see both the
live context and cumulative usage.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
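
The "34.5K ctx / 390K tot" status-bar text above implies a compact token formatter. A minimal sketch, with formatting rules assumed rather than taken from the real UI code:

```python
def fmt_tokens(n):
    # Compact display like "34.5K" / "390K" as in the status bar text
    # above. The rounding rules here are assumptions, not the real UI's.
    if n >= 1_000_000:
        return f"{n / 1_000_000:.1f}M".replace(".0M", "M")
    if n >= 1_000:
        return f"{n / 1_000:.1f}K".replace(".0K", "K")
    return str(n)

print(f"{fmt_tokens(34_500)} ctx / {fmt_tokens(390_000)} tot")
# 34.5K ctx / 390K tot
```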
The children query now fetches total_tokens (current context window size)
in addition to accumulated_total_tokens (per-session cumulative usage).
The new field is available as tokens_current on each child and
stats.tokens.current in the SSE payload. No UI changes — existing
display continues to use accumulated_total_tokens via the tokens field.

Confirmed semantics in block/goose reply_parts.rs: total_tokens is set
from the latest API response (resets after compaction), while
accumulated_total_tokens sums each call's usage onto the session total.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
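
The two update rules described above can be sketched like this. The class and method names are illustrative; the real logic lives in block/goose reply_parts.rs.

```python
# Sketch of the two token-tracking semantics: total_tokens is *set* from
# the latest API response (and resets on compaction), while
# accumulated_total_tokens is *added to* on every call and only grows.

class SessionTokens:
    def __init__(self):
        self.total_tokens = 0              # current context window size
        self.accumulated_total_tokens = 0  # running sum, never shrinks

    def on_api_response(self, usage_total):
        self.total_tokens = usage_total                 # overwrite
        self.accumulated_total_tokens += usage_total    # accumulate

    def on_compaction(self, new_context_size):
        # Compaction shrinks the live context but not the running sum.
        self.total_tokens = new_context_size

s = SessionTokens()
s.on_api_response(8_000)
s.on_api_response(12_000)
s.on_compaction(3_000)
print(s.total_tokens, s.accumulated_total_tokens)  # 3000 20000
```

This is why a session can show 1M+ accumulated tokens while its live context is only ~10K.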

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 73e6e46ed6


Addresses PR feedback: tokens_current wasn't part of the hash used to
detect tree changes, so SSE events wouldn't fire when only total_tokens
changed (e.g. after context compaction).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
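
The bug this commit fixes can be illustrated with a toy change-detection hash. The function and field names here are assumptions for illustration, not the server's actual implementation:

```python
import hashlib
import json

# Illustrative sketch: if tokens_current is left out of the hashed
# snapshot, a change that only affects total_tokens (e.g. context
# compaction) produces an identical hash, so no SSE event fires.

def tree_hash(children, include_current):
    keys = ["id", "tokens"] + (["tokens_current"] if include_current else [])
    snapshot = [{k: c[k] for k in keys} for c in children]
    return hashlib.sha256(
        json.dumps(snapshot, sort_keys=True).encode()
    ).hexdigest()

before = [{"id": "c1", "tokens": 390_000, "tokens_current": 34_500}]
after = [{"id": "c1", "tokens": 390_000, "tokens_current": 3_000}]  # post-compaction

print(tree_hash(before, False) == tree_hash(after, False))  # True: change missed
print(tree_hash(before, True) == tree_hash(after, True))    # False: event fires
```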
@AaronGoldsmith
Author

Closing - no movement in 2 weeks

