
return usage from Anthropic OpenAI adapter #6238


Draft
wants to merge 9 commits into main
Conversation

sestinj (Contributor) commented Jun 21, 2025

Description

Adds Anthropic support for returning usage data from the OpenAI adapters (chat stream only for now). Includes a test that usage data is returned, which can be extended incrementally to other providers as we add support for them.
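
For context, here is a minimal sketch of the kind of mapping involved, assuming Anthropic's documented streaming events (message_start carrying input token and cache counts, message_delta carrying a cumulative output token count) and an OpenAI-style usage object; the field handling below is illustrative, not the PR's exact code.

```ts
// Sketch: fold Anthropic stream events into an OpenAI-style usage object.
interface StreamUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  prompt_tokens_details?: { cached_tokens: number };
}

function applyAnthropicStreamEvent(usage: StreamUsage, event: any): StreamUsage {
  switch (event.type) {
    case "message_start":
      // Input-side counts arrive once, on the message_start event.
      usage.prompt_tokens = event.message.usage.input_tokens ?? 0;
      usage.prompt_tokens_details = {
        cached_tokens: event.message.usage.cache_read_input_tokens ?? 0,
      };
      break;
    case "message_delta":
      // Anthropic reports a cumulative output token count on message_delta.
      usage.completion_tokens = event.usage?.output_tokens ?? 0;
      break;
  }
  usage.total_tokens = usage.prompt_tokens + usage.completion_tokens;
  return usage;
}
```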

Checklist

  • I've read the contributing guide
  • The relevant docs, if any, have been updated or created
  • The relevant tests, if any, have been updated or created

Tests

Tests are included in main.test.ts and can later be extended to check for usage in other providers.
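
Roughly, such a test could look like the sketch below; `api`, `chatCompletionStream`, and the chunk shape are assumptions about the adapter interface (Jest-style globals assumed), not code copied from the PR.

```ts
// Hypothetical adapter handle; the real tests construct the API per provider.
declare const api: {
  chatCompletionStream(body: any, signal: AbortSignal): AsyncIterable<any>;
};

test("chat completion stream returns usage", async () => {
  const chunks: any[] = [];
  for await (const chunk of api.chatCompletionStream(
    {
      model: "claude-3-5-sonnet-latest",
      messages: [{ role: "user", content: "hi" }],
      stream: true,
    },
    new AbortController().signal,
  )) {
    chunks.push(chunk);
  }

  // The usage chunk is expected to arrive with non-zero token counts.
  const usage = chunks.find((c) => c.usage)?.usage;
  expect(usage?.prompt_tokens).toBeGreaterThan(0);
  expect(usage?.completion_tokens).toBeGreaterThan(0);
});
```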

@sestinj sestinj requested a review from a team as a code owner June 21, 2025 23:27
@sestinj sestinj requested review from RomneyDa and removed request for a team June 21, 2025 23:27

netlify bot commented Jun 21, 2025

Deploy Preview for continuedev canceled.

🔨 Latest commit: 4f5263a
🔍 Latest deploy log: https://app.netlify.com/projects/continuedev/deploys/686c81b60d89e500085d215c

@dosubot dosubot bot added the size:M This PR changes 30-99 lines, ignoring generated files. label Jun 21, 2025
@@ -223,6 +235,15 @@ export class AnthropicApi implements BaseLlmApi {
          lastToolUseName = value.content_block.name;
        }
        break;
      case "message_start":
        usage.prompt_tokens = value.message.usage.input_tokens;
        usage.prompt_tokens_details = {

Type Error: The code attempts to assign prompt_tokens_details to the usage object, but this property is not defined in the OpenAI CompletionUsage type. The CompletionUsage type only includes completion_tokens, prompt_tokens, and total_tokens. This assignment will cause a TypeScript error and may lead to runtime issues when trying to access undefined properties.


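If the installed openai SDK's CompletionUsage really does not declare prompt_tokens_details, one way to keep the assignment type-safe is to widen the type locally; this is a sketch of a possible workaround, not the fix adopted in the PR, and the import path assumes the v4 openai package layout.

```ts
import type { CompletionUsage } from "openai/resources/completions";

// Local widening so the prompt_tokens_details assignment type-checks even if
// the SDK's CompletionUsage omits the field.
type UsageWithPromptDetails = CompletionUsage & {
  prompt_tokens_details?: { cached_tokens?: number };
};

const usage: UsageWithPromptDetails = {
  prompt_tokens: 0,
  completion_tokens: 0,
  total_tokens: 0,
};

usage.prompt_tokens_details = { cached_tokens: 0 };
```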


recurseml bot commented Jun 21, 2025

😱 Found 1 issue. Time to roll up your sleeves! 😱

@dosubot dosubot bot added size:L This PR changes 100-499 lines, ignoring generated files. and removed size:M This PR changes 30-99 lines, ignoring generated files. labels Jul 8, 2025
@dosubot dosubot bot added size:XL This PR changes 500-999 lines, ignoring generated files. and removed size:L This PR changes 100-499 lines, ignoring generated files. labels Jul 8, 2025
        cacheWrite: 18.75,
        cacheRead: 1.5,
      },
      "claude-3-opus-20240229": {
A collaborator commented:

Should we hardcode prices?
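
For reference, a sketch of the pricing-table shape the diff implies, plus a small cost helper; units are assumed to be USD per million tokens, and the figures are illustrative rather than authoritative.

```ts
// Shape implied by the diff above; exact fields beyond cacheWrite/cacheRead
// are assumptions.
interface ModelPricing {
  input: number;
  output: number;
  cacheWrite?: number;
  cacheRead?: number;
}

const PRICES: Record<string, ModelPricing> = {
  // Illustrative entry only; verify against current Anthropic pricing.
  "claude-3-opus-20240229": { input: 15, output: 75, cacheWrite: 18.75, cacheRead: 1.5 },
};

function estimateCostUSD(
  model: string,
  usage: { prompt_tokens: number; completion_tokens: number },
): number | undefined {
  const p = PRICES[model];
  if (!p) return undefined;
  return (usage.prompt_tokens * p.input + usage.completion_tokens * p.output) / 1_000_000;
}
```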

@github-project-automation github-project-automation bot moved this from Todo to In Progress in Issues and PRs Jul 8, 2025
@dosubot dosubot bot added the lgtm This PR has been approved by a maintainer label Jul 8, 2025
@RomneyDa RomneyDa marked this pull request as draft July 9, 2025 22:49
RomneyDa (Collaborator) commented Jul 9, 2025

WIP, updated to draft

Labels
lgtm (This PR has been approved by a maintainer), size:XL (This PR changes 500-999 lines, ignoring generated files)
Projects
Status: In Progress

2 participants