Closed
Description
Describe the bug
I'm using `ResponseFormat` on a `PromptExecutionSettings` object. In my scenario I am setting it to `typeof(Model[])`, where `Model` is a record type with a simple structure, and I want the LLM's response to be an array of them.
When the prompt executes, an error is thrown:
```
Microsoft.SemanticKernel.HttpOperationException: HTTP 400 (invalid_request_error: invalid_value)
Parameter: response_format.json_schema.name
Invalid 'response_format.json_schema.name': string does not match pattern. Expected a string that matches the pattern '^[a-zA-Z0-9_-]+$'.
 ---> System.ClientModel.ClientResultException: HTTP 400 (invalid_request_error: invalid_value)
Parameter: response_format.json_schema.name
Invalid 'response_format.json_schema.name': string does not match pattern. Expected a string that matches the pattern '^[a-zA-Z0-9_-]+$'.
   at Azure.AI.OpenAI.ClientPipelineExtensions.ProcessMessageAsync(
...
```
To Reproduce
Steps to reproduce the behavior:
- Set `ResponseFormat` to a `typeof()` expression on an `OpenAIPromptExecutionSettings` object
- Pass that settings object as `KernelArguments` to an `InvokePromptAsync()` call
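A minimal sketch of the repro (kernel/connector setup elided; `Model` and the prompt text are illustrative placeholders, and the failure appears to occur for any array type, since the CLR type name `Model[]` contains `[]` and fails the service's `^[a-zA-Z0-9_-]+$` pattern check on `json_schema.name`):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

public record Model(string Name, string Description);

// Assumes `kernel` was built with an Azure OpenAI chat completion connector.
var settings = new OpenAIPromptExecutionSettings
{
    // Array type: the generated json_schema.name ends up as "Model[]",
    // which does not match the pattern '^[a-zA-Z0-9_-]+$'.
    ResponseFormat = typeof(Model[])
};

var result = await kernel.InvokePromptAsync(
    "List a few models as JSON.",
    new KernelArguments(settings)); // throws HttpOperationException (HTTP 400)
```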
Expected behavior
The type should be schematized with a valid `json_schema.name` and sent to the Azure OpenAI resource without error.
Platform
- Language: C#
- Source: Microsoft.SemanticKernel 1.36.1
- AI model: Azure OpenAI: GPT-4o
- IDE: Visual Studio
- OS: Windows