
.Net OpenAIPromptExecutionSettings ChatSystemPrompt ignored #4510

Closed
Jenscaasen opened this issue Jan 8, 2024 · 2 comments · Fixed by #4994
Labels
ai connector (Anything related to AI connectors) · bug (Something isn't working) · .NET (Issue or Pull requests regarding .NET code)

Comments

@Jenscaasen

Describe the bug
When setting the ChatSystemPrompt property of OpenAIPromptExecutionSettings, it is not used as the system prompt.

To Reproduce
See the code below to reproduce.

Expected behavior
ChatSystemPrompt should be used as the system prompt, or it should be renamed if it is intended for a different purpose.

Platform
.NET, SK 1.0.1

Additional context


```csharp
var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion([...]);
var kernel = builder.Build();

AzureOpenAIChatCompletionService service = (AzureOpenAIChatCompletionService)kernel.Services.GetService(typeof(IChatCompletionService));

string systemPrompt = "You are batman. If asked who you are, say 'I am Batman!'";

OpenAIPromptExecutionSettings settings = new() { ChatSystemPrompt = systemPrompt };
ChatHistory history = new ChatHistory();

history.AddUserMessage("Who are you?");

ChatMessageContent result = await service.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(result.ToString()); // Not Batman

ChatHistory history2 = new ChatHistory(systemPrompt);
history2.AddUserMessage("Who are you?");

ChatMessageContent result2 = await service.GetChatMessageContentAsync(history2, settings, kernel);
Console.WriteLine(result2.ToString()); // Batman
```
@shawncal added the .NET (Issue or Pull requests regarding .NET code) and triage labels on Jan 8, 2024
@github-actions bot changed the title from ".NET OpenAIPromptExecutionSettings ChatSystemPrompt ignored" to ".Net OpenAIPromptExecutionSettings ChatSystemPrompt ignored" on Jan 8, 2024
@matthewbolanos added the bug (Something isn't working) label on Jan 8, 2024
@matthewbolanos added the ai connector (Anything related to AI connectors) label on Jan 8, 2024
github-merge-queue bot pushed a commit that referenced this issue Jan 10, 2024
… not ignored (#4530)

### Motivation and Context

Resolves issue #4510 

### Description

Add a system message using the chat system prompt if no system message
is included in the chat history
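
In practice, the fix described above amounts to logic along these lines (a minimal sketch of the described behavior, not the actual connector code; the helper name `PrepareChatHistory` is made up for illustration):

```csharp
using System.Linq;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

internal static class ChatSystemPromptHelper
{
    // Hypothetical helper: builds a chat history that starts with the ChatSystemPrompt
    // as a system message, but only when the caller did not already supply one.
    internal static ChatHistory PrepareChatHistory(ChatHistory history, OpenAIPromptExecutionSettings settings)
    {
        var result = new ChatHistory();

        bool hasSystemMessage = history.Any(m => m.Role == AuthorRole.System);
        if (!hasSystemMessage && !string.IsNullOrWhiteSpace(settings.ChatSystemPrompt))
        {
            result.AddSystemMessage(settings.ChatSystemPrompt);
        }

        result.AddRange(history);
        return result;
    }
}
```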

### Contribution Checklist


- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄

brandonh-msft commented Feb 12, 2024

Similar repro without using ChatHistory:

```csharp
// k is a built Kernel with a chat completion connector; userPrompt is the user's question.
var settings = new OpenAIPromptExecutionSettings
{
    ChatSystemPrompt = "You are batman. If asked who you are, say 'I am Batman!'"
};

await foreach (StreamingKernelContent s in k.InvokePromptStreamingAsync(userPrompt, new(settings)))
{
    Console.Write(s);
}
```

Intercepting the HTTP client messages shows the following being sent to the OpenAI endpoint:

```jsonc
{
    "messages": [
        {
            "content": "<userPrompt>",
            "role": "system" // <<<<<--- this should not be 'system', and the 'System' message should be in this array
        }
    ],
...
}
```

Seems related to #4377.
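
For anyone trying to reproduce this, one way to intercept the outgoing HTTP traffic is to route the connector through an HttpClient whose DelegatingHandler logs each request body (a rough sketch; the handler name is made up, and the commented usage assumes the connector registration overload that accepts a custom HttpClient):

```csharp
using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical handler that prints every outgoing request body, so the
// "messages" array sent to the OpenAI endpoint can be inspected.
public sealed class RequestLoggingHandler : DelegatingHandler
{
    public RequestLoggingHandler() : base(new HttpClientHandler()) { }

    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        if (request.Content is not null)
        {
            Console.WriteLine(await request.Content.ReadAsStringAsync(cancellationToken));
        }

        return await base.SendAsync(request, cancellationToken);
    }
}

// Usage sketch (assumes an AddAzureOpenAIChatCompletion overload taking an HttpClient):
// var httpClient = new HttpClient(new RequestLoggingHandler());
// builder.AddAzureOpenAIChatCompletion(deploymentName, endpoint, apiKey, httpClient: httpClient);
```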

@dmytrostruk
Member

@brandonh-msft Thanks for reporting, I added a fix in this PR: #4994

github-merge-queue bot pushed a commit that referenced this issue Feb 13, 2024
### Motivation and Context


Resolves: #4377
Resolves: #4510

When invoking a prompt or a function from the kernel, the chat system prompt is
ignored, and the prompt is sent to the AI as a system message instead of a user
message.

An example of code that didn't work:
```csharp
var settings = new OpenAIPromptExecutionSettings { ChatSystemPrompt = "Reply \"I don't know\" to every question." };

// Result contains the right answer instead of "I don't know", as it was defined in system prompt.
// That's because the question was set to system message instead of user message and ChatSystemPrompt property was ignored.
var result = await target.InvokePromptAsync("Where is the most famous fish market in Seattle, Washington, USA?", new(settings));
```

This fix may change the behavior of applications that rely on system message
input instead of user message input. As a temporary measure, the previous
behavior can be achieved by using a chat prompt, as in the following example:
```csharp
KernelFunction function = KernelFunctionFactory.CreateFromPrompt(@"
    <message role=""system"">Where is the most famous fish market in Seattle, Washington, USA?</message>
");

var result = await kernel.InvokeAsync(function);
```

But this is just a temporary workaround; the valid usage is shown in the
first example above.

### Description


1. Updated `ChatCompletionServiceExtensions` to use the prompt as a user message instead of a system message (see the sketch after this list).
2. Added unit and integration tests to verify this scenario.
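
The caller-facing effect of that change can be pictured roughly like this (a simplified sketch, not the repository code; the real extension method has additional parameters and plumbing):

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

internal static class ChatCompletionServiceExtensionsSketch
{
    // Simplified sketch: a plain string prompt is now added as a *user* message,
    // so a ChatSystemPrompt from the execution settings can still become the
    // system message instead of being overwritten by the prompt itself.
    internal static Task<ChatMessageContent> GetChatMessageContentSketchAsync(
        this IChatCompletionService service,
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        var chatHistory = new ChatHistory();
        chatHistory.AddUserMessage(prompt); // before the fix, the prompt ended up as the system message

        return service.GetChatMessageContentAsync(chatHistory, executionSettings, kernel, cancellationToken);
    }
}
```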

### Contribution Checklist


- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄