Allow importing functions to kernel within a running function so that AI can use that in the next response #4785
What language are you using? Can you provide an example?

Additional plugins can be added to a Kernel as part of a function invocation; the Kernel is not cloned automatically, by design. Here's an example: this creates the kernel and adds a single plugin with a single function which, when invoked, will add another plugin to the kernel:

```csharp
using Microsoft.SemanticKernel;

Kernel kernel = new();
kernel.ImportPluginFromFunctions("MyFunctions",
[
    kernel.CreateFunctionFromMethod((Kernel kernel) => kernel.ImportPluginFromType<SomePluginObject>(Guid.NewGuid().ToString("N")), "AddRandomPlugin"),
]);

for (int i = 0; i < 9; i++)
{
    await kernel.InvokeAsync("MyFunctions", "AddRandomPlugin");
    Console.WriteLine(kernel.Plugins.Count);
}

class SomePluginObject
{
    [KernelFunction]
    public void Whatever() { }
}
```

When I run that, it prints out the plugin count after each invocation: 2, 3, 4, 5, 6, 7, 8, 9, 10.
Thanks @stephentoub - the example below doesn't work, though. It's C#.

```csharp
IKernelBuilder builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(TestConfiguration.AzureOpenAI.DeploymentName, TestConfiguration.AzureOpenAI.Endpoint, new DefaultAzureCredential(true));
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));
Kernel kernel = builder.Build();

kernel.Plugins.Add(KernelPluginFactory.CreateFromFunctions("HelperFunctions", new[]
{
    kernel.CreateFunctionFromMethod((Kernel kernel) =>
    {
        kernel.Plugins.Add(KernelPluginFactory.CreateFromFunctions("HelperFunctions2", new[]
        {
            kernel.CreateFunctionFromMethod(() => "Gurgul", "GetLastName", "Gets the last name of the user"),
        }));
        return "Greg";
    }, "GetFirstName", "Gets the first name of the user"),
}));

OpenAIPromptExecutionSettings settings = new() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };
var chat = kernel.GetRequiredService<IChatCompletionService>();
var chatHistory = new ChatHistory();

while (true)
{
    Console.Write("Question: ");
    string question = Console.ReadLine() ?? string.Empty;
    if (question == "done")
    {
        break;
    }

    chatHistory.AddUserMessage(question);
    StringBuilder sb = new();
    await foreach (var update in chat.GetStreamingChatMessageContentsAsync(chatHistory, settings, kernel))
    {
        if (update.Content is not null)
        {
            Console.Write(update.Content);
            sb.Append(update.Content);
        }
    }
    chatHistory.AddAssistantMessage(sb.ToString());
    Console.WriteLine();
}
```

Output:
I would expect it to be able to use the new function, but it doesn't. However, going forward:

This proves the kernel is indeed not cloned, but the issue persists: it always requires another user input before the model can use a function it just discovered. As a side note, it would be good to add
@stephentoub does this make sense?
Yes, thanks. The behavior you're seeing matches how it's implemented: for a given call into the IChatCompletionService, it currently loads the kernel's functions once and doesn't attempt to reload them after invoking a function as part of a function call request. If that function modifies the plugins in the kernel, the kernel is indeed changed, but the tool calls used as part of that same turn aren't repopulated. The next time you call into the IChatCompletionService and it loads the functions from the kernel, it will pick up the latest set, though. It seems reasonable to tweak this, however, so that the tool call settings are regenerated before sending the function call results back up to the model.
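The stale-snapshot behavior described above can be sketched without Semantic Kernel at all (the dictionary and names below are illustrative stand-ins, not SK APIs): a tool list captured once before the loop never sees a function registered mid-turn, while a list rebuilt after each invocation does.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative stand-in for the kernel's mutable plugin collection.
var functions = new Dictionary<string, Func<string>>
{
    ["GetFirstName"] = () => "Greg",
};

// Current behavior: the tool list is captured once, before the auto-invocation loop.
List<string> snapshot = functions.Keys.ToList();

// Invoking GetFirstName registers a new function mid-"turn".
functions["GetLastName"] = () => "Gurgul";
_ = functions["GetFirstName"]();

// The stale snapshot still doesn't advertise the new function...
Console.WriteLine(snapshot.Contains("GetLastName"));      // False

// ...whereas a tool list regenerated after the invocation does.
List<string> regenerated = functions.Keys.ToList();
Console.WriteLine(regenerated.Contains("GetLastName"));   // True
```

The proposed tweak corresponds to rebuilding `regenerated` after every function invocation inside the loop, instead of reusing `snapshot` for the whole turn.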
@stephentoub it would be truly awesome if this gets implemented, even more so as it doesn't look like a complicated change (I hope). This unlocks many exciting possibilities. |
@stephentoub is this PR what you had in mind? |
@kboom, sorry, I missed the notification. That's close but not quite what I had in mind; that will re-add functions even if they already exist. I'll put up a PR. |
…ion (#5376) The KernelFunctions available can change when the plug-ins in the Kernel change. Today, we build up the list of tools at the beginning of the auto-invocation loop but then don't update the list after a function has been invoked, which means those updates won't be reflected in subsequent calls made as part of the response to the LLM. This PR updates the loop to regenerate the tool call information after each function invocation, in case it has been changed dynamically. Fixes #4785
name: Feature request
about: Allow importing functions to kernel within a running function so that AI can use that in the next response
Today, the functions are invoked with a cloned kernel, and importing functions into this kernel has no effect. This means that we cannot dynamically plug in new functions as a result of a function being called. Conversely, it should also be possible to take a plugin out, again from within a function. This would go a long way toward preventing hallucination. It would work like a graph: you start at node X, we load all nodes that make sense to go to from X, and this is all dynamic, based on the results of the actual function call. For example, if the function provides reasonable results, we can offer the means to dig into it a bit more. Functions that are engaged should be usable without the user having to create a next prompt.
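The graph idea in the request can be sketched as follows (all names here are hypothetical, not SK APIs): each "node" is a function, and visiting it retires itself and plugs in the functions reachable from it, so the model's tool set evolves with the conversation.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical function graph: invoking a node exposes its follow-up functions,
// mimicking plugins being imported into / removed from a kernel mid-conversation.
var edges = new Dictionary<string, string[]>
{
    ["SearchOrders"]    = new[] { "GetOrderDetails", "CancelOrder" },
    ["GetOrderDetails"] = new[] { "RefundOrder" },
};

var available = new HashSet<string> { "SearchOrders" };

void Visit(string node)
{
    available.Remove(node);                        // take the used function out
    if (edges.TryGetValue(node, out var next))
    {
        foreach (var f in next) available.Add(f);  // plug in the next hops
    }
}

Visit("SearchOrders");
Console.WriteLine(available.Contains("GetOrderDetails")); // True
Console.WriteLine(available.Contains("SearchOrders"));    // False
```

With the fix in #5376, such mid-turn changes to the available functions would be reflected before the function call results are sent back to the model, rather than only on the next user prompt.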