Describe the bug
response_format is not supported by AzureAIInferenceChatPromptExecutionSettings. The Azure AI Inference chat completions API accepts a response_format parameter, but the prompt execution settings class neither exposes it nor forwards it to the call.
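For reference, a minimal sketch (based on the azure-ai-inference structured-output sample; the endpoint, key, and schema below are placeholders) showing that the underlying client does accept response_format directly:

# Sketch only: endpoint, key, and schema are placeholders.
from azure.ai.inference import ChatCompletionsClient
from azure.ai.inference.models import JsonSchemaFormat, UserMessage
from azure.core.credentials import AzureKeyCredential

client = ChatCompletionsClient(
    endpoint="https://<your-endpoint>.services.ai.azure.com/models",  # placeholder
    credential=AzureKeyCredential("<your-api-key>"),  # placeholder
)

response = client.complete(
    messages=[UserMessage(content="Give me a cookie recipe as JSON.")],
    response_format=JsonSchemaFormat(
        name="Recipe_JSON_Schema",
        schema={
            "type": "object",
            "properties": {
                "title": {"type": "string"},
                "ingredients": {"type": "array", "items": {"type": "string"}},
            },
            "required": ["title", "ingredients"],
            "additionalProperties": False,
        },
        strict=True,
    ),
)
print(response.choices[0].message.content)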
To Reproduce
Neither of the snippets below passes response_format to the underlying chat completions call (imports and a minimal placeholder Recipe model are included here for completeness):

from pydantic import BaseModel
from azure.ai.inference.models import JsonSchemaFormat
from semantic_kernel.connectors.ai.azure_ai_inference import AzureAIInferenceChatPromptExecutionSettings

# Placeholder model standing in for the real Recipe schema.
class Recipe(BaseModel):
    title: str
    ingredients: list[str]

response_format = JsonSchemaFormat(
    name="Recipe_JSON_Schema",
    schema=Recipe.model_json_schema(),
    description="",
    strict=True,
)

# Neither settings object forwards response_format to the service:
AzureAIInferenceChatPromptExecutionSettings(
    service_id="service-id", response_format=response_format
)
AzureAIInferenceChatPromptExecutionSettings(
    service_id="service-id", extra_parameters={"response_format": response_format}
)
Expected behavior
response_format set on AzureAIInferenceChatPromptExecutionSettings (either directly or via extra_parameters) should be forwarded to the underlying Azure AI Inference chat completions call so that structured output is honored.
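For comparison, a sketch of the OpenAI connector's settings (assuming its existing structured-output support; the Recipe model here is hypothetical), which already expose a response_format field that is forwarded to the service. Ideally the Azure AI Inference settings would behave the same way:

from pydantic import BaseModel
from semantic_kernel.connectors.ai.open_ai import OpenAIChatPromptExecutionSettings

# Hypothetical model standing in for the real Recipe schema.
class Recipe(BaseModel):
    title: str
    ingredients: list[str]

# The OpenAI connector forwards response_format to the chat completions call.
openai_settings = OpenAIChatPromptExecutionSettings(
    service_id="service-id",
    response_format=Recipe,
)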
Platform
- Language: Python
- Source: pip package semantic-kernel 1.26.1