.Net: Bug: Ollama function calling stopped working #11521
Hi @csharpyoudull, could you please provide some more details on how I can reproduce the problem you are seeing? It would be ideal if you could also point out which packages and versions your project is using. Thanks.
Our previously working tool calls have stopped working with the Ollama connector. Below is how we're configuring the service and setting up my kernel functions:

```csharp
builder.AddOllamaChatCompletion("llama3.1:latest", new Uri("http://localhost:11434/"));
```

The objects passed as plugins have public async functions that return objects or strings, and each function is decorated with `[KernelFunction]`. In version 1.36.0 of the Ollama connector (with the same SK version) these tools were being used; since upgrading, they are not. These tools do still work when we get chat completion from OpenAI set up the same way. This is the code being used to execute requests.

The Ollama Windows desktop version is 0.6.5.
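The plugin shape described above (public async methods returning objects or strings, each marked with `[KernelFunction]`) might look like the following minimal sketch. The class and method names here are hypothetical illustrations, not code from the original project:

```csharp
using System.ComponentModel;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;

// Hypothetical plugin matching the described setup:
// public async methods returning strings or objects, each tagged [KernelFunction].
public class OrderStatusPlugin
{
    [KernelFunction, Description("Gets the status of an order by its id.")]
    public async Task<string> GetOrderStatusAsync(string orderId)
    {
        await Task.Delay(10); // stand-in for a real async lookup
        return $"Order {orderId} is shipped";
    }
}
```

A plugin like this would typically be registered via `builder.Plugins.AddFromObject(new OrderStatusPlugin())` or `builder.Plugins.AddFromType<OrderStatusPlugin>()`.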
TL;DR: I didn't find any issue, and all plugins worked as expected with our demonstration code against Ollama 0.6.5 and the latest Semantic Kernel version.
Since I didn't have information on how you configured your plugins, I relied on our demonstration plugins available here: https://github.com/microsoft/semantic-kernel/tree/main/dotnet/samples/Demos/OllamaFunctionCalling

```csharp
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using OllamaFunctionCalling;

#pragma warning disable SKEXP0070 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.

var builder = Kernel.CreateBuilder()
    .AddOllamaChatCompletion("llama3.1:latest", new Uri("http://localhost:11434"));

builder.Plugins
    .AddFromType<MyTimePlugin>()
    .AddFromObject(new MyLightPlugin(turnedOn: true))
    .AddFromObject(new MyAlarmPlugin("11"));

builder.Services.AddSingleton<IFunctionInvocationFilter, MyFunctionInvocationFilter>();

var kernel = builder.Build();
var chatCompletion = kernel.GetRequiredService<IChatCompletionService>();

var settings = new PromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

Console.WriteLine("""
    Ask questions or give instructions to the copilot such as:
    - Change the alarm to 8
    - What is the current alarm set?
    - Is the light on?
    - Turn the light off please.
    - Set an alarm for 6:00 am.
    """);

Console.Write("> ");

while (true)
{
    var input = Console.ReadLine();
    if (string.IsNullOrEmpty(input)) break;

    try
    {
        ChatMessageContent chatResult = await chatCompletion.GetChatMessageContentAsync(input, settings, kernel);
        Console.Write($"\n>>> Result: {chatResult}\n\n> ");
    }
    catch (Exception ex)
    {
        Console.WriteLine($"Error: {ex.Message}\n\n> ");
    }
}

public class MyFunctionInvocationFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        await next(context);
        Console.ForegroundColor = ConsoleColor.Yellow;
        Console.WriteLine($"Invoking function {context.Function.Name} - Result: {context.Result}");
        Console.ResetColor();
    }
}
```

Please consider giving more information about the dependencies and versions used in your project, as well as your current plugins; so far I have not been able to reproduce your problem.
I have semantic functions in C# that still work with OpenAI and that previously also worked with the Ollama connector. I have updated to version 1.46.0 and am running Ollama version 0.6.5 with the model llama3.1:latest.
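To help the maintainers reproduce this, the versions mentioned in the thread could be pinned explicitly in the project file. This is a sketch under assumptions: the package names below are the standard Semantic Kernel packages, and the `-alpha` suffix on the Ollama connector reflects that it ships as prerelease, so the exact version string may differ:

```xml
<ItemGroup>
  <!-- Versions from the thread; the Ollama connector version string is an assumption,
       as that package is published as a prerelease. -->
  <PackageReference Include="Microsoft.SemanticKernel" Version="1.46.0" />
  <PackageReference Include="Microsoft.SemanticKernel.Connectors.Ollama" Version="1.46.0-alpha" />
</ItemGroup>
```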