Specify OpenAI Execution settings per Invoke #5797
Hi all! We currently have an internal tool that needs to let users specify their preferred GPT model (as well as other settings, such as temperature) per request. It seems like in SK, the models are pre-defined when the Kernel is built.
Replies: 1 comment, 3 replies
Hi @LefanTan, it's possible to set OpenAI execution settings per `Invoke` by passing a settings model within `KernelArguments`. Here is the syntax: semantic-kernel/dotnet/samples/KernelSyntaxExamples/Example58_ConfigureExecutionSettings.cs
Lines 47 to 54 in 9b8a218
Please let me know if that resolves your scenario. Thank you!
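For readers landing here later, the approach above can be sketched roughly as follows. This is a minimal illustration assuming a recent Semantic Kernel 1.x .NET release; the model IDs (`gpt-4o`, `gpt-4o-mini`), the API-key placeholder, and the prompt are placeholder values, not taken from the linked sample:

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Connectors.OpenAI;

// Build a Kernel with a default model. The model configured here is only
// the fallback; per-invocation settings can override it.
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion(modelId: "gpt-4o", apiKey: "<your-api-key>")
    .Build();

// Execution settings chosen by the user for this one call:
// a different model and a custom temperature.
var settings = new OpenAIPromptExecutionSettings
{
    ModelId = "gpt-4o-mini",  // hypothetical user-selected model
    Temperature = 0.2,
};

// Wrap the settings in KernelArguments and pass them to the invocation,
// so they apply to this Invoke only.
var arguments = new KernelArguments(settings)
{
    ["input"] = "Semantic Kernel lets you swap models per call."
};

var result = await kernel.InvokePromptAsync("Summarize: {{$input}}", arguments);
Console.WriteLine(result);
```

The key design point is that `KernelArguments` carries both the prompt variables and the `PromptExecutionSettings`, so the settings travel with the individual request rather than being fixed at Kernel construction time.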