Confirm this is a Node library issue and not an underlying OpenAI API issue

Describe the bug

The `maxChatCompletions` option is very helpful in managing costs. However, it is not exposed through the interface due to a typing bug. I think the fix is simply to change the `Core.RequestOptions` type to `RunnerOptions`, which only contains `maxChatCompletions` at the moment.

To Reproduce

Try passing `maxChatCompletions` to the `runTools`/`runFunctions` options.

Code snippets

No response

OS

macOS

Node version

v18.16.0

Library version

v4.24.0
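The mismatch and the proposed fix can be sketched in isolation (the type shapes below are simplified assumptions for illustration, not the library's actual definitions of `Core.RequestOptions` and `RunnerOptions`):

```typescript
// Simplified stand-ins for the library types; the real Core.RequestOptions
// and RunnerOptions carry more fields. These shapes are assumptions.
interface RequestOptions {
  timeout?: number;
}

interface RunnerOptions extends RequestOptions {
  maxChatCompletions?: number;
}

// Current signature: the options parameter is typed as the narrower
// RequestOptions, so the compiler rejects maxChatCompletions.
// runToolsCurrent({ maxChatCompletions: 1 }); // type error: unknown property
function runToolsCurrent(options?: RequestOptions): void {
  void options;
}

// Proposed fix: type the parameter as RunnerOptions so the option is
// visible through the interface.
function runToolsFixed(options?: RunnerOptions): number {
  // Treat a missing value as "no limit" (a sentinel chosen for this sketch).
  return options?.maxChatCompletions ?? Infinity;
}
```

With the widened signature, `runToolsFixed({ maxChatCompletions: 1 })` type-checks and the limit is honored.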
Casting an options object that includes `maxChatCompletions` to `RequestOptions` is a workaround: `{ maxChatCompletions: 1 } as RequestOptions`.
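In a self-contained sketch (with assumed, simplified type shapes rather than the library's real ones), the cast workaround from the comment above satisfies the compiler while the value still reaches the runner at runtime:

```typescript
// Simplified stand-in for the library's Core.RequestOptions (assumed shape).
interface RequestOptions {
  timeout?: number;
}

interface RunnerOptions extends RequestOptions {
  maxChatCompletions?: number;
}

// Mirrors the current runTools/runFunctions signature, where the options
// parameter is typed too narrowly as RequestOptions.
function runTools(options?: RequestOptions): number | undefined {
  // At runtime the extra property survives the cast, so the runner can
  // still read it by narrowing back to RunnerOptions.
  return (options as RunnerOptions | undefined)?.maxChatCompletions;
}

// The `as` assertion silences the type error at the call site.
const limit = runTools({ maxChatCompletions: 1 } as RequestOptions);
```

Here `limit` ends up as `1`: the option flows through at runtime even though the type signature hides it.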
Thanks, we'll try to fix this soon!