
Specify OpenAI Execution settings per Invoke #5797

Answered by dmytrostruk
LefanTan asked this question in Q&A


Hi @LefanTan, it's possible to set OpenAI execution settings per Invoke by passing a settings object within KernelArguments. Here is the syntax:

var result = await kernel.InvokePromptAsync(
    prompt,
    new(new OpenAIPromptExecutionSettings()
    {
        MaxTokens = 60,
        Temperature = 0.7
    }));
this.WriteLine(result.GetValue<string>());
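
For a self-contained picture, here is a minimal sketch of the same pattern outside the samples' test harness. It assumes Semantic Kernel 1.x with the OpenAI connector package; the model id, API key, prompt text, and "topic" argument are placeholders I chose for illustration, and Console.WriteLine stands in for the samples' this.WriteLine helper:

// Minimal sketch: per-invoke execution settings passed via KernelArguments.
// Assumes Semantic Kernel 1.x and the Microsoft.SemanticKernel.Connectors.OpenAI package.
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId: "gpt-4o-mini", apiKey: "<your-api-key>") // placeholder values
    .Build();

// Execution settings that apply to this call only.
var settings = new OpenAIPromptExecutionSettings
{
    MaxTokens = 60,
    Temperature = 0.7
};

// KernelArguments carries both the per-invoke settings and any template variables.
var arguments = new KernelArguments(settings)
{
    ["topic"] = "semantic kernel"
};

var result = await kernel.InvokePromptAsync("Write a haiku about {{$topic}}.", arguments);
Console.WriteLine(result.GetValue<string>());

Because the settings travel with the KernelArguments for that single call, different invocations of the same kernel can use different MaxTokens or Temperature values without reconfiguring the service registration.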

Please let me know if that resolves your scenario. Thank you!

Answer selected by LefanTan