
fix: HTTP verbose logging and anthropic provider usage fallback (#211)#212

Merged
konard merged 8 commits into main from issue-211-ba03f381fd30
Mar 9, 2026

Conversation


@konard konard commented Mar 9, 2026

Summary

Fixes two issues reported in #211:

  1. HTTP request/response logging not appearing in --verbose mode: The lazy log callback pattern (log.info(() => ({...}))) passed through the log-lazy npm package, adding indirection that could lose output when the CLI runs as a subprocess. Changed all 5 HTTP log call sites to use direct calls (log.info('msg', data)) since the verbose check is already done at the top of the wrapper.

  2. "Provider returned zero tokens with unknown finish reason" error: When using opencode provider with @ai-sdk/anthropic SDK, the standard AI SDK usage object is empty but providerMetadata.anthropic.usage contains valid token data (snake_case keys: input_tokens, output_tokens). Added an anthropic metadata fallback in getUsage() to extract tokens from this metadata, similar to the existing OpenRouter fallback.

Root Cause Analysis

See full case study: docs/case-studies/issue-211/README.md

Issue 1: HTTP Logging

  • Location: js/src/provider/provider.ts (verbose fetch wrapper)
  • The lazy callback chain: log.info(callback) → lazyLogInstance.info(wrappedCallback) → log-lazy bit flag check → console.log()
  • Fix: Use direct log.info('message', {data}) which calls output() synchronously
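
The direct-call shape can be sketched with a toy logger. Names like makeLogger and the payload fields are illustrative, not the actual provider.ts API:

```typescript
// Minimal sketch of the two logging styles (illustrative, not the real
// provider.ts code). The lazy form defers building the payload until the
// logger decides to emit; the direct form writes synchronously.
type LogData = Record<string, unknown>;

function makeLogger(verbose: boolean) {
  const lines: string[] = [];
  return {
    lines,
    // Lazy style: the callback is only invoked after an extra layer of
    // indirection -- the kind of deferral that can lose output when the
    // CLI runs as a subprocess with piped stdout.
    infoLazy(cb: () => { msg: string } & LogData) {
      if (verbose) {
        const { msg, ...data } = cb();
        lines.push(`${msg} ${JSON.stringify(data)}`);
      }
    },
    // Direct style: message and data are passed eagerly and written
    // synchronously, matching the fix described above.
    info(msg: string, data: LogData = {}) {
      if (verbose) lines.push(`${msg} ${JSON.stringify(data)}`);
    },
  };
}

const log = makeLogger(true);
log.info('http request', { method: 'POST', url: '/v1/messages' });
console.log(log.lines[0]);
// -> http request {"method":"POST","url":"/v1/messages"}
```

Since the verbose gate is already applied at the top of the wrapper, the lazy form's deferred payload construction buys nothing here and only adds a layer where output can be dropped.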

Issue 2: Empty Token Usage

  • Location: js/src/session/index.ts (getUsage() function)
  • The opencode/minimax-m2.5-free model uses @ai-sdk/anthropic SDK with custom baseURL
  • API returns usage in providerMetadata.anthropic.usage (snake_case) but standard usage is empty
  • Fix: Added anthropic metadata fallback, extended empty check to cover zero-valued tokens
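
The fallback described above might look roughly like this. The Usage and ProviderMetadata shapes are assumptions inferred from the PR description, not the real js/src/session/index.ts types:

```typescript
// Sketch of the getUsage() anthropic fallback (hypothetical shapes).
interface Usage { inputTokens?: number; outputTokens?: number }
interface ProviderMetadata {
  anthropic?: { usage?: { input_tokens?: number; output_tokens?: number } };
}

function getUsage(usage: Usage, meta: ProviderMetadata = {}): Usage {
  // Treat both undefined and zero-valued tokens as "empty", since some
  // providers report 0 instead of omitting the fields.
  const empty = !usage.inputTokens && !usage.outputTokens;
  const anthropic = meta.anthropic?.usage;
  if (empty && anthropic) {
    // Anthropic metadata uses snake_case keys.
    return {
      inputTokens: anthropic.input_tokens ?? 0,
      outputTokens: anthropic.output_tokens ?? 0,
    };
  }
  return usage;
}

console.log(getUsage(
  { inputTokens: 0, outputTokens: 0 },
  { anthropic: { usage: { input_tokens: 120, output_tokens: 45 } } },
));
// -> { inputTokens: 120, outputTokens: 45 }
```

When the standard usage object carries nonzero values, it is returned untouched, so providers that populate it normally are unaffected.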

Changes

| File | Change |
| --- | --- |
| js/src/provider/provider.ts | 5 log calls: lazy → direct |
| js/src/session/index.ts | Added anthropic metadata fallback in getUsage() |
| js/tests/session-usage.test.ts | 7 new test cases for the anthropic fallback |
| js/.changeset/fix-http-verbose-and-anthropic-usage.md | Changeset for patch release |
| docs/case-studies/issue-211/ | Case study with log analysis and root causes |
| experiments/ | Investigation scripts |
| js/package.json | Version bump to 0.16.15 |

Test plan

  • All 136 tests pass (session-usage, verbose-http-logging, provider, log-lazy)
  • 7 new tests for anthropic metadata fallback pass
  • ESLint: no errors
  • Prettier: all files formatted correctly
  • File size check: all files within limit
  • Changeset validation: passed
  • CI: all checks passing (changeset, lint, tests on ubuntu/macos/windows)

🤖 Generated with Claude Code

Fixes #211

Adding .gitkeep for PR creation (default mode).
This file will be removed when the task is complete.

Issue: #211
@konard konard self-assigned this Mar 9, 2026
konard and others added 5 commits March 9, 2026 16:46
…lbacks (#211)

The lazy log callback pattern (log.info(() => ({...}))) passed through the
log-lazy npm package, adding indirection that could lose output when the CLI
runs as a subprocess. Since the verbose check is already done at the top of
the wrapper, lazy evaluation provides no benefit here.

Changed all 5 HTTP log call sites to use direct calls (log.info('msg', data))
which write to stdout synchronously via the output() function.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
When using opencode provider with @ai-sdk/anthropic SDK, the standard AI SDK
usage object is empty but providerMetadata.anthropic.usage contains valid
token data with snake_case keys (input_tokens, output_tokens).

Added fallback in getUsage() to extract tokens from anthropic metadata when
standard usage is empty. Also extended the empty check to cover providers
that set tokens to 0 instead of undefined.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Added 7 test cases covering:
- Fallback to anthropic metadata when standard usage has undefined tokens
- Fallback when standard usage has zero tokens
- Cache read tokens extraction from anthropic metadata
- Preference for standard usage when it has valid data
- Graceful handling when anthropic metadata is missing
- Preference for openrouter metadata over anthropic metadata
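
The precedence these tests pin down — standard usage first, then openrouter metadata, then anthropic metadata — can be sketched as follows. Names and shapes are illustrative, not the real test suite or getUsage signature:

```typescript
// Self-contained sketch of the metadata precedence the tests verify
// (hypothetical names; not the actual session code).
type TokenUsage = { inputTokens: number; outputTokens: number };

function pickUsage(
  standard: Partial<TokenUsage>,
  openrouter?: TokenUsage,
  anthropicSnake?: { input_tokens: number; output_tokens: number },
): TokenUsage {
  // Standard usage wins whenever it carries nonzero data.
  if (standard.inputTokens || standard.outputTokens) {
    return {
      inputTokens: standard.inputTokens ?? 0,
      outputTokens: standard.outputTokens ?? 0,
    };
  }
  // Existing OpenRouter fallback is consulted before the new anthropic one.
  if (openrouter) return openrouter;
  if (anthropicSnake) {
    return {
      inputTokens: anthropicSnake.input_tokens,
      outputTokens: anthropicSnake.output_tokens,
    };
  }
  return { inputTokens: 0, outputTokens: 0 };
}

// Preference for openrouter metadata over anthropic metadata:
console.log(pickUsage({}, { inputTokens: 10, outputTokens: 5 },
  { input_tokens: 99, output_tokens: 99 }));
// -> { inputTokens: 10, outputTokens: 5 }
```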

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Case study in docs/case-studies/issue-211/ includes:
- Full session log (verbose-session-log.txt)
- Root cause analysis for both HTTP logging and usage data issues
- Timeline of events and comparison with upstream projects
- Experiment scripts used during investigation

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@konard konard changed the title from "[WIP] We were not printing all raw http requests and responses, and was not able to finish execution of task using Agent CLI" to "fix: HTTP verbose logging and anthropic provider usage fallback (#211)" Mar 9, 2026
@konard konard marked this pull request as ready for review March 9, 2026 16:47

konard commented Mar 9, 2026

🤖 Solution Draft Log

This log file contains the complete execution trace of the AI solution draft process.

💰 Cost estimation:

  • Public pricing estimate: $15.391314
  • Calculated by Anthropic: $10.856364 USD
  • Difference: $-4.534950 (-29.46%)
    📎 Log file uploaded as Gist (6020KB)
    🔗 View complete solution draft log

The working session has now ended; feel free to review and add any feedback on the solution draft.


konard commented Mar 9, 2026

🔄 Auto-restart 1/3

Detected uncommitted changes from previous run. Starting new session to review and commit them.

Uncommitted files:

?? gist-log.txt
?? models-dev-opencode.txt

Auto-restart will stop after the changes are committed or after 2 more iterations. Please wait until the working session ends, then give your feedback.

…#211)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

konard commented Mar 9, 2026

🔄 Auto-restart 1/3 Log

This log file contains the complete execution trace of the AI solution draft process.

💰 Cost estimation:

  • Public pricing estimate: $3.169879
  • Calculated by Anthropic: $2.043547 USD
  • Difference: $-1.126332 (-35.53%)
    📎 Log file uploaded as Gist (7880KB)
    🔗 View complete solution draft log

The working session has now ended; feel free to review and add any feedback on the solution draft.


konard commented Mar 9, 2026

✅ Ready to merge

This pull request is now ready to be merged:

  • All CI checks have passed
  • No merge conflicts
  • No pending changes

Monitored by hive-mind with --auto-restart-until-mergeable flag

@konard konard merged commit 099373c into main Mar 9, 2026
8 checks passed
Development

Successfully merging this pull request may close these issues.

We were not printing all raw http requests and responses, and was not able to finish execution of task using Agent CLI
