How to send KernelArguments to KernelFunctions when using IChatCompletionService #6457
My goal is for the user to have a conversation with the model through Semantic Kernel's AzureOpenAIChatCompletion service. I cannot seem to find a way to initialize KernelArguments so that they can be used inside a plugin. Where do I pass the KernelArguments in? If it were a one-time prompt I could easily pass them along with the invocation, but in my rudimentary chat-loop example I struggle to pass in the KernelArguments. :)
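The code snippets from the original post did not survive extraction. A minimal sketch of the kind of setup being asked about might look like this (the `securityFilters` name appears later in the thread; `SearchPlugin`, the placeholder credentials, and the filter value are illustrative assumptions, not from the original post):

```csharp
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion("<deployment>", "<endpoint>", "<api-key>");
builder.Plugins.AddFromType<SearchPlugin>();
var kernel = builder.Build();

// Arguments the plugin should be able to see:
var arguments = new KernelArguments { ["securityFilters"] = "group:finance" };

// One-time prompt: easy, the arguments go right here.
// var once = await kernel.InvokePromptAsync("Find the latest report.", arguments);

// Chat loop with IChatCompletionService: where do the arguments go?
var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();
OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};

history.AddUserMessage(Console.ReadLine()!);
var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);
Console.WriteLine(reply.Content);
```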
Replies: 1 comment 1 reply
@ystvan Thank you for your question! There are a couple of ways to achieve your scenario:

1. If the securityFilters property does not contain PII or any other sensitive information, you can inject these details as part of a user or system message in your chat history. Since auto function invocation is enabled, the LLM will be able to use that information, and Semantic Kernel will inject it into your plugin.
2. If you don't want to share the securityFilters content with the LLM, you could use the kernel.Data property bag to set this information. In your plugin, you can inject the Kernel instance, which will contain that property; here is an example of Kernel injection: semantic-kernel/dotnet/samples/Concepts/Fu…

Please let us know if any of these approaches work for you, thanks!
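The first approach (surfacing the value through the chat history) could be sketched like this. This is a minimal, hypothetical example: the plugin shape, deployment settings, and filter value are placeholders, and it assumes the filter is safe to show to the model:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Microsoft.SemanticKernel.Connectors.OpenAI;

var builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion("<deployment>", "<endpoint>", "<api-key>");
builder.Plugins.AddFromType<SearchPlugin>();
var kernel = builder.Build();

var chat = kernel.GetRequiredService<IChatCompletionService>();
var history = new ChatHistory();

// Put the non-sensitive argument into a system message; with auto function
// invocation enabled, the model can pass it to the plugin by itself.
history.AddSystemMessage("When searching, always apply the security filter: group:finance");
history.AddUserMessage("Find the latest quarterly report.");

OpenAIPromptExecutionSettings settings = new()
{
    ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions
};
var reply = await chat.GetChatMessageContentAsync(history, settings, kernel);

public class SearchPlugin
{
    [KernelFunction, Description("Searches documents, honoring a security filter.")]
    public string Search(
        [Description("The search query")] string query,
        [Description("Security filter to apply")] string securityFilter)
        => $"Results for '{query}' filtered by '{securityFilter}'";
}
```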
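The second approach, kernel.Data plus Kernel injection, might look like the sketch below. The securityFilters key comes from the thread; the plugin and values are illustrative. Because the value travels through the kernel's property bag rather than the prompt, it is never shown to the LLM:

```csharp
using System.ComponentModel;
using Microsoft.SemanticKernel;

// Stash the sensitive value in the kernel's property bag before the chat loop.
kernel.Data["securityFilters"] = "group:finance";

public class SearchPlugin
{
    [KernelFunction, Description("Searches documents, applying caller-supplied security filters.")]
    public string Search(
        Kernel kernel, // Semantic Kernel injects the current Kernel instance here
        [Description("The search query")] string query)
    {
        // Read the value back out of the property bag inside the plugin.
        var filters = kernel.Data.TryGetValue("securityFilters", out var value)
            ? value as string
            : null;
        return $"Results for '{query}' with filters '{filters ?? "none"}'";
    }
}
```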