Python: Bug: response_format not supported with AzureAIInferenceChatPromptExecutionSettings #11952

@ManniArora

Description

Describe the bug
Response format is not supported with AzureAIInferenceChatPromptExecutionSettings. The Azure AI Inference chat completion API supports response_format, but the prompt execution settings class does not expose or forward it.

For reference: https://github.com/Azure/azure-sdk-for-python/blob/372556c19c9ee3e3b5c1338c4be5eb287d902472/sdk/ai/azure-ai-inference/azure/ai/inference/aio/_patch.py#L142

To Reproduce
Neither of the following passes `response_format` through to the underlying completions call:

```python
from azure.ai.inference.models import JsonSchemaFormat
from pydantic import BaseModel
from semantic_kernel.connectors.ai.azure_ai_inference import (
    AzureAIInferenceChatPromptExecutionSettings,
)

class Recipe(BaseModel):  # minimal stand-in for the Recipe model used here
    title: str
    ingredients: list[str]

response_format = JsonSchemaFormat(
    name="Recipe_JSON_Schema",
    schema=Recipe.model_json_schema(),
    description="",
    strict=True,
)

# Neither form forwards response_format to the completions call:
AzureAIInferenceChatPromptExecutionSettings(
    service_id="service-id", response_format=response_format
)
AzureAIInferenceChatPromptExecutionSettings(
    service_id="service-id", extra_parameters={"response_format": response_format}
)
```
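For context, this is roughly the structured-output payload that a forwarded `response_format` would need to produce on the wire. A minimal stdlib-only sketch; the schema dict and the exact field layout are assumptions based on the OpenAI-style `json_schema` response format, not the SDK's actual serialization:

```python
import json

# Hypothetical schema for Recipe, standing in for Recipe.model_json_schema().
recipe_schema = {
    "type": "object",
    "properties": {
        "title": {"type": "string"},
        "ingredients": {"type": "array", "items": {"type": "string"}},
    },
    "required": ["title", "ingredients"],
}

# Assumed OpenAI-style json_schema wire shape; azure-ai-inference's actual
# serialization of JsonSchemaFormat may differ in detail.
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "Recipe_JSON_Schema",
        "schema": recipe_schema,
        "strict": True,
    },
}

print(json.dumps(response_format, indent=2))
```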

Expected behavior
response_format should be forwarded to the underlying Azure AI Inference chat completions call, so the model returns output conforming to the supplied JSON schema.

Platform

  • Language: Python
  • Source: pip package version 1.26.1


Labels

python: Pull requests for the Python Semantic Kernel
