Confirm this is a Node library issue and not an underlying OpenAI API issue
- This is an issue with the Node library
Describe the bug
Description
The OpenAI Node.js client merges the default temperature setting into all API requests, including Assistants API run creations. However, the o3-mini reasoning model does not support sampling parameters like temperature, causing any inclusion of this parameter to result in a 400 error: “Unsupported parameter: 'temperature' is not supported with this model.”
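For context, the merged run-creation body presumably ends up looking like this (reconstructed for illustration, not captured from an actual request):

```ts
// Reconstructed payload after the client merges its temperature default;
// the merged field is what o3-mini rejects.
const body = {
  model: 'o3-mini',
  reasoning_effort: 'medium',
  assistant_id: assistantId,
  stream: true,
  temperature: 1, // merged from the client-level default
};
```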
Expected Behavior
The SDK should automatically omit unsupported sampling parameters for reasoning models like o3-mini, allowing the run creation to succeed without errors.
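A minimal sketch of the omission described above; the helper names, the model check, and the parameter list are assumptions for illustration, not part of the openai-node SDK:

```ts
// Hypothetical helper illustrating the expected behavior: drop sampling
// parameters before dispatching a request to a reasoning model.
const SAMPLING_PARAMS = ['temperature', 'top_p'] as const;

// Assumption: o-series model IDs ('o1', 'o3-mini', ...) mark reasoning models.
function isReasoningModel(model: string): boolean {
  return /^o\d/.test(model);
}

function stripUnsupportedParams(model: string, body: Record<string, unknown>) {
  if (!isReasoningModel(model)) return body;
  const cleaned = { ...body };
  for (const param of SAMPLING_PARAMS) delete cleaned[param];
  return cleaned;
}
```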
Actual Behavior
The client continues to merge the temperature default into the run creation request payload, and the Assistants API rejects it with a 400 error.
Notes
This error was previously reported in #1318, but no fix followed, and users (myself included) are still hitting it.
To Reproduce
- Instantiate the client with a default temperature:

  ```ts
  import OpenAI from 'openai';

  const client = new OpenAI({
    apiKey: process.env.OPENAI_API_KEY,
    temperature: 1,
  });
  ```

- Create an assistant and a thread, then attempt to create a run with o3-mini:

  ```ts
  await client.beta.threads.runs.create(thread.id, {
    model: 'o3-mini',
    reasoning_effort: 'medium',
    instructions: 'Some instructions.',
    assistant_id: assistantId,
    stream: true,
  });
  ```

- Observe the 400 error:

  ```
  400 Unsupported parameter: 'temperature' is not supported with this model.
  ```
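Until the SDK handles this, a workaround consistent with the behavior described above is to use a second client instance that carries no temperature default for reasoning-model runs:

```ts
import OpenAI from 'openai';

// Client without the temperature default, used only for o3-mini runs,
// so nothing unsupported gets merged into the request body.
const reasoningClient = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

await reasoningClient.beta.threads.runs.create(thread.id, {
  model: 'o3-mini',
  reasoning_effort: 'medium',
  instructions: 'Some instructions.',
  assistant_id: assistantId,
  stream: true,
});
```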
Code snippets
OS
Ubuntu 25.04
Node version
Node.js 22.14.0
Library version
openai 4.95.1