Description:
When using the OpenAI Agents SDK, I encountered an issue where part.usage.promptTokens (or completionTokens) returned NaN, which led to exceptions being thrown at runtime. This seems to occur when the backend model fails to populate the usage field correctly.
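For context, a minimal sketch of the failure mode (illustrative only; the stream-part shape and variable names mirror this report, not the SDK's exact internals):

```js
// Minimal sketch of the failure mode; not the SDK's actual code.
const part = {
  type: 'finish',
  // Backend never populated usage, or populated it with non-numeric values.
  usage: { promptTokens: NaN, completionTokens: undefined },
};

// The assignments flagged in aiSdk.mjs pick these values up unchanged.
const usagePromptTokens = part.usage.promptTokens;         // NaN
const usageCompletionTokens = part.usage.completionTokens; // undefined

// Any later arithmetic on the counts is now NaN, which downstream checks
// can reject by throwing.
console.log(usagePromptTokens + usageCompletionTokens); // NaN
```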
Location in Code:
node_modules/@openai/agents-extensions/dist/aiSdk.mjs
Specifically in the finish case handler (around line 505).
- usagePromptTokens = part.usage.promptTokens;
- usageCompletionTokens = part.usage.completionTokens;
+ usagePromptTokens = Number.isFinite(part.usage?.promptTokens) ? part.usage.promptTokens : 0;
+ usageCompletionTokens = Number.isFinite(part.usage?.completionTokens) ? part.usage.completionTokens : 0;
Suggested Fix:
The patch shown above makes the code resilient to missing or NaN values in the usage field by falling back to 0. A similar patch was also applied in aiSdk.js.
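As a quick sanity check, the same guard can be expressed as a small standalone helper; safeTokenCount is hypothetical and not part of the SDK:

```js
// Hypothetical helper (not in the SDK) showing the intended fallback:
// any missing or non-finite count becomes 0.
function safeTokenCount(value) {
  return Number.isFinite(value) ? value : 0;
}

console.log(safeTokenCount(128));       // 128
console.log(safeTokenCount(NaN));       // 0
console.log(safeTokenCount(undefined)); // 0 (usage object absent or unpopulated)
```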
Steps to Reproduce:
- Trigger a call to an OpenAI-compatible model with an agent extension.
- Receive a partial response where usage.promptTokens is NaN.
- The SDK throws an exception due to math operations involving NaN (see the sketch below).
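For completeness, a rough sketch of the setup these steps describe, assuming the adapter wiring shown in the package READMEs; the endpoint, API key, and model id below are placeholders:

```js
// Rough repro sketch; imports and wiring follow the package READMEs.
// The baseURL, apiKey, and model id are placeholders for an
// OpenAI-compatible backend that omits the usage field.
import { Agent, run } from '@openai/agents';
import { aisdk } from '@openai/agents-extensions';
import { createOpenAI } from '@ai-sdk/openai';

const provider = createOpenAI({
  baseURL: 'http://localhost:8000/v1', // placeholder endpoint
  apiKey: 'placeholder',
});

const agent = new Agent({
  name: 'Assistant',
  instructions: 'You are a helpful assistant.',
  model: aisdk(provider('my-model')), // placeholder model id
});

// If the backend's finish chunk carries no usable usage data, the adapter's
// finish handler ends up doing arithmetic on NaN/undefined and throws.
const result = await run(agent, 'Say hello.');
console.log(result.finalOutput);
```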
Expected Behavior:
The SDK should handle undefined or NaN usage tokens gracefully and continue execution without throwing.
Environment:
- @openai/agents-extensions version: [include version]
- Node.js version: [include version]
- Runtime: [e.g., Node, browser, edge function, etc.]