Closed as not planned
Labels
question (Further information is requested)
Description
Confirm this is an issue with the Python library and not an underlying OpenAI API
- This is an issue with the Python library
Describe the bug
Temperature is not supported by the o3 models. A similar issue was reported earlier (#2072) and was supposed to be fixed in the 1.61.1 release (#2078):
openai.BadRequestError: Error code: 400 - {'error': {'message': "Unsupported parameter: 'temperature' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'temperature', 'code': 'unsupported_parameter'}}
To Reproduce
from openai import OpenAI

messages = [
    {"role": "system", "content": "You are an expert"},
    {"role": "user", "content": "What is the capital of France"},
]
client = OpenAI(api_key=api_key)
response = client.chat.completions.create(
    model="o3-mini",
    messages=messages,
    temperature=0,  # rejected by o3-mini with a 400 error
)
print(response.choices[0].message.content)
Running this code raises the error above.
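As a workaround until the library handles this, sampling parameters can be stripped before the request is sent. The sketch below is an assumption, not library behavior: the model-name prefixes and the set of rejected parameters are guesses based on the error message, and `filter_params` is a hypothetical helper, not part of the openai package.

```python
# Workaround sketch: reasoning models (e.g. o3-mini) reject sampling
# parameters such as `temperature` with a 400 unsupported_parameter error,
# so drop them before calling chat.completions.create.
# The prefix list and parameter set below are assumptions.

REASONING_PREFIXES = ("o1", "o3")
UNSUPPORTED_FOR_REASONING = {
    "temperature",
    "top_p",
    "presence_penalty",
    "frequency_penalty",
}

def filter_params(model: str, params: dict) -> dict:
    """Return a copy of `params` without keys the reasoning models reject."""
    if model.startswith(REASONING_PREFIXES):
        return {k: v for k, v in params.items() if k not in UNSUPPORTED_FOR_REASONING}
    return dict(params)

# Usage: pass the filtered dict to client.chat.completions.create(**kwargs)
kwargs = filter_params("o3-mini", {"messages": [], "temperature": 0})
```

For non-reasoning models (e.g. `gpt-4o`) the parameters pass through unchanged, so the same call site can serve both model families.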
OS
macOS
Python version
Python 3.13.1
Library version
openai 1.61.1