Closed
Description
This occurs in `ai` 4.3.16 with `@ai-sdk/openai` ^1.3.22. When using `openai.responses("codex-mini-latest")`, the SDK still sends the `temperature` param even though it is explicitly set to `undefined`:
```ts
const result = await generateText({
  model: openai.responses('codex-mini-latest'),
  prompt,
  system: opts.system,
  tools,
  messages: opts.messages,
  maxSteps: opts.maxSteps,
  toolChoice: undefined,
  temperature: undefined,
  topP: opts.topP,
  experimental_prepareStep: opts.experimental_prepareStep,
});
```
`codex-mini-latest` does not support the `temperature` param, so for this model it should either not be sent to the Responses API or default to 0. The API returns this error:

```
Unsupported parameter: 'temperature' is not supported with this model.
```
AI SDK Version

- `ai`: 4.3.16
- `@ai-sdk/openai`: ^1.3.22