
Allow importing functions to kernel within a running function so that AI can use that in the next response #4785

Closed
kboom opened this issue Jan 30, 2024 · 8 comments · Fixed by #5376
Labels: follow up (Issues that require a follow up from the community), kernel (Issues or pull requests impacting the core kernel)

@kboom

kboom commented Jan 30, 2024


name: Feature request
about: Allow importing functions to kernel within a running function so that AI can use that in the next response


Today, functions are invoked with a cloned kernel, so importing functions into that kernel has no effect. This means we cannot dynamically plug in new functions as a result of a function being called. Conversely, it should also be possible to remove a plugin from within a function. This would go a long way toward preventing hallucination. It would work like a graph: you start at node X, and we load all the nodes that make sense to reach from X, all dynamically, based on the results of the actual function call. For example, if a function returns reasonable results, we can offer the means to dig into them a bit more. Functions engaged this way should be usable without the user having to write another prompt.
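The graph-based gating described above could be sketched roughly like this. This is a hypothetical, self-contained sketch using plain collections rather than Semantic Kernel types; all function names and the `OnFunctionSucceeded` hook are invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical "function graph": the result of one call determines which
// functions are exposed for the next turn. All names here are invented.
var graph = new Dictionary<string, string[]>
{
    ["SearchCustomers"] = new[] { "GetCustomerOrders", "GetCustomerProfile" },
    ["GetCustomerOrders"] = new[] { "GetOrderDetails" },
};

// Start with only the entry-point function visible to the model.
var visible = new HashSet<string> { "SearchCustomers" };

// After a function call succeeds, retire the node and swap in its neighbours,
// so the model can only "dig deeper" along edges that make sense.
void OnFunctionSucceeded(string name)
{
    visible.Remove(name);
    foreach (var next in graph.GetValueOrDefault(name, Array.Empty<string>()))
        visible.Add(next);
}

OnFunctionSucceeded("SearchCustomers");
Console.WriteLine(string.Join(", ", visible.OrderBy(n => n)));
// GetCustomerOrders, GetCustomerProfile
```

The point of the sketch is that the visible set changes as a side effect of a successful call, which only helps if the model is re-told about the new set within the same turn.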

@stephentoub
Member

Today, the functions are invoked with a cloned kernel and importing functions in this kernel has no effect.

What language are you using? Can you provide an example? Additional plugins can be added to a Kernel as part of a function invocation; the Kernel is not cloned automatically, by design.

Here's an example: this creates the kernel and adds a single plugin with a single function which, when invoked, will add another plugin to the kernel:

using Microsoft.SemanticKernel;

Kernel kernel = new();

kernel.ImportPluginFromFunctions("MyFunctions",
[
    kernel.CreateFunctionFromMethod((Kernel kernel) => kernel.ImportPluginFromType<SomePluginObject>(Guid.NewGuid().ToString("N")), "AddRandomPlugin"),
]);

for (int i = 0; i < 9; i++)
{
    await kernel.InvokeAsync("MyFunctions", "AddRandomPlugin");
    Console.WriteLine(kernel.Plugins.Count);
}

class SomePluginObject
{
    [KernelFunction]
    public void Whatever() { }
}

When I run that, it prints out:

2
3
4
5
6
7
8
9
10

@matthewbolanos matthewbolanos added follow up Issues that require a follow up from the community. and removed triage labels Jan 30, 2024
@matthewbolanos matthewbolanos added the kernel Issues or pull requests impacting the core kernel label Jan 30, 2024
@kboom
Author

kboom commented Jan 31, 2024

Thanks @stephentoub - the example below doesn't work. It's C#.

IKernelBuilder builder = Kernel.CreateBuilder();
builder.AddAzureOpenAIChatCompletion(TestConfiguration.AzureOpenAI.DeploymentName, TestConfiguration.AzureOpenAI.Endpoint, new DefaultAzureCredential(true));
builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));
Kernel kernel = builder.Build();

kernel.Plugins.Add(KernelPluginFactory.CreateFromFunctions("HelperFunctions", new[]
{
    kernel.CreateFunctionFromMethod((Kernel kernel) => {
        kernel.Plugins.Add(KernelPluginFactory.CreateFromFunctions("HelperFunctions2", new[] {
            kernel.CreateFunctionFromMethod(() => "Gurgul", "GetLastName", "Gets the last name of the user"),
        }));

        return "Greg";
    }, "GetFirstName", "Gets the first name of the user"),
}));

OpenAIPromptExecutionSettings settings = new() { ToolCallBehavior = ToolCallBehavior.AutoInvokeKernelFunctions };
var chat = kernel.GetRequiredService<IChatCompletionService>();
var chatHistory = new ChatHistory();

while (true)
{
    Console.Write("Question: ");
    string question = Console.ReadLine() ?? string.Empty;
    if (question == "done")
    {
        break;
    }

    chatHistory.AddUserMessage(question);
    StringBuilder sb = new();
    await foreach (var update in chat.GetStreamingChatMessageContentsAsync(chatHistory, settings, kernel))
    {
        if (update.Content is not null)
        {
            Console.Write(update.Content);
            sb.Append(update.Content);
        }
    }
    chatHistory.AddAssistantMessage(sb.ToString());
    Console.WriteLine();
}

Output:

Question: What is my first and last name? Use functions available to you.
trce: Microsoft.SemanticKernel.Connectors.OpenAI.AzureOpenAIChatCompletionService[0]
      Function call requests: HelperFunctions_GetFirstName({})
info: GetFirstName[0]
      Function GetFirstName invoking.
trce: GetFirstName[0]
      Function arguments: {}
trce: Example59_OpenAIFunctionCalling.<>c[0]
      Created KernelFunction 'GetLastName' for '<RunAsync>b__0_2'
info: GetFirstName[0]
      Function GetFirstName succeeded.
trce: GetFirstName[0]
      Function result: Greg
info: GetFirstName[0]
      Function completed. Duration: 0.0022244s
Your first name is Greg. However, I'm unable to retrieve your last name as I only have access to the tool that provides the first name. If you'd like, you can tell me your last name or if there is anything else I can assist you with, let me know!
Question:

I would expect it to be able to use the new function, but it doesn't. However, continuing the conversation:

Question: Try again
info: GetFirstName[0]
      Function completed. Duration: 0.0988838s
info: GetLastName[0]
      Function GetLastName invoking.
trce: GetLastName[0]
      Function arguments: {}
Your full name is Greg Gurgul. If there's anything else I can assist you with, please let me know!
Question:

This confirms the kernel isn't cloned, but the issue persists: the model always requires new user input before it can use a function it has just discovered. As a side note, it would be good to add TryImport methods, since that would allow idempotent behavior, and TryRemove as well, so plugins can be dynamically engaged and disengaged.
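The requested Try* semantics could look something like this. This is a hypothetical sketch over a plain name-keyed dictionary, not the real KernelPluginCollection API; the `TryImport`/`TryRemove` helpers are invented:

```csharp
using System;
using System.Collections.Generic;

// Stand-in for the kernel's plugin collection, keyed by plugin name.
var plugins = new Dictionary<string, object>(StringComparer.OrdinalIgnoreCase);

// Returns false instead of throwing when the name is already present,
// so a function can re-run safely (idempotent engage).
bool TryImport(string name, object plugin) => plugins.TryAdd(name, plugin);

// Returns false when there was nothing to remove (idempotent disengage).
bool TryRemove(string name) => plugins.Remove(name);

Console.WriteLine(TryImport("HelperFunctions2", new object())); // True
Console.WriteLine(TryImport("HelperFunctions2", new object())); // False: already present
Console.WriteLine(TryRemove("HelperFunctions2"));               // True
Console.WriteLine(TryRemove("HelperFunctions2"));               // False: already gone
```

With semantics like these, a function that fires on every turn could re-import its follow-up plugin without first checking whether a previous turn already added it.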

@kboom
Author

kboom commented Feb 7, 2024

@stephentoub does this make sense?

@stephentoub
Member

@stephentoub does this make sense?

Yes, thanks. The behavior you're seeing matches how it's implemented: for a given call into the IChatCompletionService, it currently loads the kernel's functions once and doesn't reload them after invoking a function as part of a function call request. If that function modifies the kernel's plugins, the kernel is indeed changed, but the tool list used for the rest of that same turn isn't repopulated. The next call into the IChatCompletionService will load the functions from the kernel again and pick up the latest set.

It seems reasonable to tweak this, though, so that the tool call settings are regenerated before the function call results are sent back to the model.
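The difference can be illustrated with a simplified model of the auto-invocation loop. This is not the actual connector code; a plain dictionary of delegates stands in for the kernel's plugins, mirroring the repro above where GetFirstName registers GetLastName mid-turn:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Stand-in for the kernel's plugins; calling GetFirstName adds a new
// function as a side effect, just like the repro above.
var functions = new Dictionary<string, Func<string>>();
functions["GetFirstName"] = () =>
{
    functions["GetLastName"] = () => "Gurgul";
    return "Greg";
};

// Before the fix: the tool list is snapshot once, before any invocation,
// so the model is never told about GetLastName within the same turn.
var staleTools = functions.Keys.ToList();
functions["GetFirstName"]();
Console.WriteLine(staleTools.Contains("GetLastName")); // False

// After the fix: the tool list is regenerated after each invocation, so
// the next request to the model within the same turn advertises it.
var freshTools = functions.Keys.ToList();
Console.WriteLine(freshTools.Contains("GetLastName")); // True
```

In the real implementation the regenerated list would feed into the tool definitions sent with the follow-up request to the model, which is what lets it call the newly added function without waiting for another user prompt.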

@kboom
Author

kboom commented Feb 15, 2024

@stephentoub it would be truly awesome if this gets implemented, even more so as it doesn't look like a complicated change (I hope). This unlocks many exciting possibilities.

@kboom
Author

kboom commented Feb 29, 2024

@stephentoub is this PR what you had in mind?

@stephentoub
Member

@kboom, sorry, I missed the notification. That's close but not quite what I had in mind; that will re-add functions even if they already exist. I'll put up a PR.

@stephentoub
Member

#5376

github-merge-queue bot pushed a commit that referenced this issue Mar 8, 2024
…ion (#5376)

A KernelFunction can change when plug-ins are in the Kernel. Today, we
build up the list of tools at the beginning of the auto-invocation loop
but then don't update the list after a function has been invoked, which
means those updates won't be reflected in subsequent calls made as part
of the response to the LLM. This PR updates the loop to regenerate the
tool call information after each function invocation, in case they've
been changed dynamically.

Fixes #4785
LudoCorporateShark pushed a commit to LudoCorporateShark/semantic-kernel that referenced this issue Aug 25, 2024
…ion (microsoft#5376)

Bryan-Roe pushed a commit to Bryan-Roe/semantic-kernel that referenced this issue Oct 6, 2024
…ion (microsoft#5376)
