Closed as duplicate of #351
Description
Describe the bug
After updating to GitHub Copilot CLI v0.0.344 (commit c87f222), a single interactive prompt is reported as consuming ~31 Premium requests, instead of 1 Premium request scaled by the 0.33× model multiplier. The same workflow worked as expected yesterday on the previous version.
I have not tried other models.
Affected version
0.0.344 Commit: c87f222
Steps to reproduce the behavior
Launch Copilot CLI interactively (e.g., gh copilot or your usual entrypoint).
Submit one prompt, for example:
Please mask the token that logged on the debug log.
Let Copilot complete the task, then view the session usage summary and the Copilot usage report.
Expected behavior
A single user prompt on claude-haiku-4.5 should consume 1 Premium request × the 0.33 model multiplier, not dozens.
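For reference, a minimal sketch of the expected accounting, assuming one Premium request per user prompt scaled by the model multiplier (the function name and structure here are illustrative, not Copilot's actual billing code):

```python
MODEL_MULTIPLIER = 0.33  # claude-haiku-4.5 multiplier shown in the session summary


def expected_premium_cost(user_prompts: int, multiplier: float = MODEL_MULTIPLIER) -> float:
    """Expected Premium-request charge: one request per user prompt, scaled."""
    return user_prompts * multiplier


expected = expected_premium_cost(1)  # 0.33 for a single prompt
observed = 31                        # "Total usage est" from the session output below
print(f"expected={expected}, observed is ~{round(observed / expected)}x higher")
```

Under this accounting, one prompt should cost 0.33 Premium requests; the session summary instead reports an estimate of 31.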
Additional context
Session Output (redacted)
Welcome to GitHub Copilot CLI
Version 0.0.344 · Commit c87f222
...
● Logged in with gh as user: <username>
● Connected to GitHub MCP Server
> Please mask the token that logged on the debug log.
...
● Total usage est: 31 Premium requests
Total duration (API): 1m 36.1s
Total duration (wall): 27m 22.4s
Total code changes: 112 lines added, 4 lines removed
Usage by model:
unknown 0 input, 1.0m output, 0 cache read, 0 cache write (Est. 31 Premium requests)
...
claude-haiku-4.5 (0.33x)
Note: Yesterday’s version did not overcount.
karahanharunn