.Net: Bug: Ollama function calling stopped working #11521


Open
csharpyoudull opened this issue Apr 11, 2025 · 3 comments
Assignees
Labels
blocked This issue is blocked from making progress bug Something isn't working needs more info Any issue that is requires more information from contributors .NET Issue or Pull requests regarding .NET code

Comments

@csharpyoudull

I have semantic functions in C# that are still working with OpenAI and were previously also working with the Ollama connector. I have updated to version 1.46.0 and have Ollama version 0.6.5 running with model llama3.1:latest.

@csharpyoudull csharpyoudull added the bug Something isn't working label Apr 11, 2025
@markwallace-microsoft markwallace-microsoft added .NET Issue or Pull requests regarding .NET code triage labels Apr 11, 2025
@github-actions github-actions bot changed the title Bug: Ollama function calling stopped working .Net: Bug: Ollama function calling stopped working Apr 11, 2025
@RogerBarreto
Member

Hi @csharpyoudull, can you please provide some more details on how I can reproduce the problem you are seeing?

It would also be ideal if you could point out which packages and versions your project is using.

Thanks.

@RogerBarreto RogerBarreto added follow up Issues that require a follow up from the community. needs more info Any issue that is requires more information from contributors labels Apr 14, 2025
@RogerBarreto RogerBarreto removed the status in Semantic Kernel Apr 14, 2025
@RogerBarreto RogerBarreto added blocked This issue is blocked from making progress and removed follow up Issues that require a follow up from the community. labels Apr 14, 2025
@csharpyoudull
Author

Our previously working tool calls have stopped working using the Ollama connector. Below is how we're configuring the service:

builder.AddOllamaChatCompletion("llama3.1:latest", new Uri("http://localhost:11434/"));

and setting up my kernel functions:

 foreach (var plugin in plugins)
    builder.Plugins.AddFromObject(plugin);

The objects passed as plugins have public async functions that return objects or strings. Each function has [KernelFunction] and [Description("Some description")] attributes, and the parameters of these functions also have Description attributes.
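For illustration only (the reporter's actual plugins are not shown in this thread), a minimal plugin matching the description above might look like this; the class, method, and parameter names here are hypothetical:

```csharp
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Hypothetical plugin shape: a public async method returning a string,
// annotated with [KernelFunction] and [Description], with a described parameter.
public class WeatherPlugin
{
    [KernelFunction]
    [Description("Gets the current temperature for a city.")]
    public async Task<string> GetTemperatureAsync(
        [Description("The name of the city.")] string city)
    {
        // Placeholder implementation; a real plugin would call a weather service.
        await Task.CompletedTask;
        return $"The temperature in {city} is 21 C.";
    }
}
```

Such a class would be registered with builder.Plugins.AddFromObject(new WeatherPlugin()) or AddFromType&lt;WeatherPlugin&gt;().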

In version 1.36.0 of the Ollama connector, with the same SK version, these tools were being used; since upgrading they are not. The same tools still work when we get chat completion from OpenAI, set up the same way with:
builder.AddOpenAIChatCompletion(modelName, key);

This is the code being used to execute requests:

var chatCompletion = kernel.GetRequiredService<IChatCompletionService>();
var settings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

var response = await chatCompletion.GetChatMessageContentsAsync(
    history,
    settings,
    kernel,
    ct);

The Ollama Windows desktop version is 0.6.5.

@RogerBarreto
Member

RogerBarreto commented Apr 15, 2025

TL;DR: I didn't find any issue; all plugins worked as expected running our demonstration code against Ollama 0.6.5 and the latest Semantic Kernel version.

  <ItemGroup>
    <PackageReference Include="Microsoft.SemanticKernel.Connectors.Ollama" Version="1.46.0-alpha" />
    <PackageReference Include="Microsoft.SemanticKernel.Core" Version="1.46.0" />
  </ItemGroup>

Since I didn't have information on how you configured your plugins, I relied on our demonstration plugins, available here:

https://github.com/microsoft/semantic-kernel/tree/main/dotnet/samples/Demos/OllamaFunctionCalling

using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using OllamaFunctionCalling;

#pragma warning disable SKEXP0070 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.

var builder = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("llama3.1:latest", new Uri("http://localhost:11434"));

builder.Plugins
    .AddFromType<MyTimePlugin>()
    .AddFromObject(new MyLightPlugin(turnedOn: true))
    .AddFromObject(new MyAlarmPlugin("11"));

builder.Services.AddSingleton<IFunctionInvocationFilter, MyFunctionInvocationFilter>();

var kernel = builder.Build();
    
var chatCompletion = kernel.GetRequiredService<IChatCompletionService>();
var settings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

Console.WriteLine("""
    Ask questions or give instructions to the copilot such as:
    - Change the alarm to 8
    - What is the current alarm set?
    - Is the light on?
    - Turn the light off please.
    - Set an alarm for 6:00 am.
    """);

Console.Write("> ");

while (true)
{
    var input = Console.ReadLine();
    if (string.IsNullOrEmpty(input)) break;

    try
    {
        ChatMessageContent chatResult = await chatCompletion.GetChatMessageContentAsync(input, settings, kernel);
        Console.Write($"\n>>> Result: {chatResult}\n\n> ");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error: {ex.Message}\n\n> ");
    }
}

public class MyFunctionInvocationFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        await next(context);

        Console.ForegroundColor = ConsoleColor.Yellow;
        Console.WriteLine($"Invoking function {context.Function.Name} - Result: {context.Result}");
        Console.ResetColor();
    }
}

Result prompt: [screenshot omitted]

Please consider giving more information about the dependencies and versions used in your project, as well as your current plugins; so far I could not reproduce your problem.

Projects
Status: Bug
Development

No branches or pull requests

4 participants