
json_object response_format not behaving as expected on Azure gpt-3.5-turbo-0125 #1465

@milstan

Description


Confirm this is an issue with the Python library and not an underlying OpenAI API

  • This is an issue with the Python library

Describe the bug

Using the OpenAI Python library with an Azure OpenAI instance, I am trying to generate a response guaranteed to be in JSON format (only asking for it in the text prompt sometimes yields inadequate results).

For a request with the following parameters:

'model': 'gpt-3.5-turbo-0125', 'response_format': {'type': 'json_object'}

I am getting the following error:

openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: 'response_format' of type 'json_object' is not supported with this model.", 'type': 'invalid_request_error', 'param': 'response_format', 'code': None}}

Yet, my understanding is that (according to the Azure documentation page):

  • json mode is supported by the model gpt-35-turbo (0125)
  • and the 2024-02-01 API version which I use, supports json_object response_format

To Reproduce

  1. Set up an Azure OpenAI instance
  2. Run the following (credentials elided):

from openai.lib.azure import AsyncAzureOpenAI

client = AsyncAzureOpenAI(
    api_key=...,
    api_version="2024-02-01",
    azure_endpoint=...,
    azure_deployment=...,
)

response = await client.chat.completions.create(
    model="gpt-3.5-turbo-0125",
    messages=[
        {"role": "user", "content": ...},
    ],
    response_format={"type": "json_object"},
)

print(response.choices[0].message.content)
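As a side note for anyone reproducing this: independent of the Azure 400 above, the OpenAI API also rejects `json_object` requests when the word "JSON" does not appear anywhere in the messages, so a repro should include such an instruction. A minimal sketch of the request parameters (no network call; the deployment name and message contents are placeholders, not from the original report):

```python
import json

# Hypothetical request parameters for JSON mode. On Azure, the deployment
# name (here a placeholder) is what selects the underlying model version.
request_kwargs = {
    "model": "example-deployment",
    "messages": [
        # JSON mode requires the word "JSON" somewhere in the messages,
        # otherwise the API returns a 400 of its own.
        {"role": "system", "content": "You are an assistant that replies in JSON."},
        {"role": "user", "content": "List three colors."},
    ],
    "response_format": {"type": "json_object"},
}

# With JSON mode active, the returned message content should parse cleanly:
sample_content = '{"colors": ["red", "green", "blue"]}'  # illustrative only
parsed = json.loads(sample_content)
```

If the 400 in this report disappears when the deployment actually runs `gpt-35-turbo (0125)`, that would point at a deployment/model mismatch rather than a library bug.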

Code snippets

No response

OS

macOS

Python version

Python 3.11.2

Library version

openai v1.30.4
