Conversation

@samurai00

Background

stream_options cannot currently be passed to the OpenAI-compatible providers.

Summary

  • Allow passing config to chat models (see the sketch after this list)
  • Refine the stream_options handling in OpenAICompatibleChatLanguageModel so that the stream_options field is omitted when includeUsage is false or undefined, preventing undefined from being sent in the request body.
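
To make the first point concrete, here is a minimal sketch of a provider wiring config through to a chat model. It assumes the v4 constructor shape (modelId, settings, config) and that includeUsage is a config field, as the reviewed snippet's `this.config.includeUsage` suggests; the DeepSeek URL, model id, provider name, and environment variable are illustrative only, not the package's documented setup.

```ts
import { OpenAICompatibleChatLanguageModel } from '@ai-sdk/openai-compatible';

// Illustrative sketch: a provider author constructing a chat model and
// passing extra config (includeUsage) through to it. Field names other than
// includeUsage mirror the package's config shape; values are placeholders.
const model = new OpenAICompatibleChatLanguageModel(
  'deepseek-chat',
  {}, // settings (v4 constructor shape assumed here)
  {
    provider: 'deepseek.chat',
    url: ({ path }) => `https://api.deepseek.com/v1${path}`,
    headers: () => ({
      Authorization: `Bearer ${process.env.DEEPSEEK_API_KEY ?? ''}`,
    }),
    // With config now reaching the chat model, streamed requests can ask the
    // upstream API to report token usage.
    includeUsage: true,
  },
);
```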

Verification

Tasks

  • Tests have been added / updated (for bug fixes / features)
  • Documentation has been added / updated (for bug fixes / features)
  • A patch changeset for relevant packages has been added (for bug fixes / features - run pnpm changeset in the project root)
  • Formatting issues have been fixed (run pnpm prettier-fix in the project root)

Future Work

Related Issues

Fixes #6774 (providerOptions.stream_options No Longer Effective in Latest SDK, Preventing Usage Data for DeepSeek/Volcengine)

@samurai00 force-pushed the feat-passing-config-to-chat-models branch from 1d638ae to f4f171d on June 25, 2025 02:16
@samurai00 changed the title from "Feat passing config to chat models" to "feat(openai-compatible): passing config to chat models" on Jun 25, 2025
@lgrammel changed the base branch from main to v4 on June 30, 2025 07:31
samurai00 added 2 commits July 1, 2025 18:01
Refine the `stream_options` handling in `OpenAICompatibleChatLanguageModel` to correctly omit the `stream_options` field when `includeUsage` is false/undefined, preventing `undefined` from being sent in the request body.
@samurai00 force-pushed the feat-passing-config-to-chat-models branch from c70af4b to fb2c151 on July 1, 2025 10:01
Comment on lines 388 to 390
```ts
stream_options: this.config.includeUsage
  ? { include_usage: true }
  : undefined,
```
Collaborator

We use JSON.stringify, which omits undefined-valued properties from objects, so how is this any different in terms of the body that is sent?
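
For context, a quick TypeScript illustration of the JSON.stringify behavior being referenced (the field values are made up):

```ts
// JSON.stringify drops object properties whose value is undefined,
// so both bodies below serialize to the same JSON text.
const withUndefined = { model: 'gpt-4o-mini', stream_options: undefined };
const withoutField = { model: 'gpt-4o-mini' };

console.log(JSON.stringify(withUndefined)); // {"model":"gpt-4o-mini"}
console.log(JSON.stringify(withoutField)); // {"model":"gpt-4o-mini"}
```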

Author

Thanks for the feedback!

With the original code, if includeUsage is false, stream_options is always set to undefined, wiping out any value that providerOptions might have had. The new code only applies the override when includeUsage is true, leaving the original stream_options untouched otherwise.
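
A minimal sketch of that difference, with args standing in for the already-assembled request body; apart from stream_options / include_usage / includeUsage, the names are illustrative rather than the actual implementation:

```ts
type StreamOptions = { include_usage?: boolean };
type RequestBody = { model: string; stream: true; stream_options?: StreamOptions };

// Original behaviour: the override always runs, so a stream_options value that
// arrived via providerOptions is replaced with undefined when includeUsage is falsy.
function withOverrideAlways(args: RequestBody, includeUsage?: boolean): RequestBody {
  return {
    ...args,
    stream_options: includeUsage ? { include_usage: true } : undefined,
  };
}

// Refined behaviour: only apply the override when includeUsage is true,
// leaving any existing stream_options on args untouched otherwise.
function withConditionalOverride(args: RequestBody, includeUsage?: boolean): RequestBody {
  return includeUsage ? { ...args, stream_options: { include_usage: true } } : args;
}

const args: RequestBody = {
  model: 'deepseek-chat',
  stream: true,
  stream_options: { include_usage: true }, // e.g. supplied via providerOptions
};

console.log(withOverrideAlways(args, undefined).stream_options); // undefined
console.log(withConditionalOverride(args, undefined).stream_options); // { include_usage: true }
```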

Collaborator

when would stream_options be included in args?

---
'@ai-sdk/openai-compatible': patch
---

Allow passing config to chat models; refine the `stream_options` handling in `OpenAICompatibleChatLanguageModel` to correctly omit the `stream_options` field when `includeUsage` is false/undefined, preventing `undefined` from being sent in the request body.
Collaborator

2nd part can be removed

Author

So to confirm: should stream_options only be determined by config.includeUsage, and not by what's in providerOptions?

Collaborator

we can change that in v5 but for v4 this is sufficient imo

@lgrammel merged commit 2a8a853 into vercel:v4 on Jul 2, 2025
7 of 8 checks passed