
.Net: Bug: Llama3.2-vision and Ollama Connector not working anymore. #12452

Closed
@tennaito

Description

Describe the bug
The code below worked fine with llama3.2-vision on the 1.31 packages (and the Ollama release from that time), but after upgrading to the latest version it no longer works. Any directions?

I have since retested on 1.31 and it no longer works there either... I am lost.

	public static async Task<string> AnalyzeImage(string text, ReadOnlyMemory<byte> image, string? mimeType, OllamaPromptExecutionSettings? settings = null, CancellationToken cancellationToken = default)
	{
		settings ??= new OllamaPromptExecutionSettings();
		var kernel = CreateKernel(settings.ModelId ?? "llama3.2-vision");
		var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();

		// Build a single user message containing both the text prompt and the image.
		ChatHistory input = new ChatHistory();
		ChatMessageContentItemCollection items = new ChatMessageContentItemCollection();
		items.Add(new TextContent(text));
		items.Add(new ImageContent(image, mimeType));
		input.AddMessage(AuthorRole.User, items);

		ChatMessageContent result = await chatCompletionService.GetChatMessageContentAsync(input, settings, kernel, cancellationToken);
		return result.Items.OfType<TextContent>().First().Text;
	}

	public static Kernel CreateKernel(string model = "llama3.1")
	{
		var builder = Kernel.CreateBuilder();
		builder.Services.AddTransient(s =>
		{
			var client = new HttpClient(new HttpClientHandler());
			client.Timeout = TimeSpan.FromMinutes(2);
			return client;
		});
		
		builder.Services.AddOllamaChatCompletion(model, new Uri("http://localhost:11434/"));
		return builder.Build();
	}

To Reproduce
Execute method AnalyzeImage.

Expected behavior
No error after the version upgrade.

On version 1.31 it returned the analysis of the image with the text context from the user input (nowadays the same error occurs there too).
On version 1.56 it returns:

Response status code does not indicate success: 500 (Internal Server Error).

at System.Net.Http.HttpResponseMessage.EnsureSuccessStatusCode()
at OllamaSharp.OllamaApiClient.EnsureSuccessStatusCodeAsync(HttpResponseMessage response)
at OllamaSharp.OllamaApiClient.SendToOllamaAsync(HttpRequestMessage requestMessage, OllamaRequest ollamaRequest, HttpCompletionOption completionOption, CancellationToken cancellationToken)
at OllamaSharp.OllamaApiClient.ChatAsync(ChatRequest request, CancellationToken cancellationToken)+MoveNext()
at OllamaSharp.OllamaApiClient.ChatAsync(ChatRequest request, CancellationToken cancellationToken)+System.Threading.Tasks.Sources.IValueTaskSource<System.Boolean>.GetResult()
at OllamaSharp.IAsyncEnumerableExtensions.StreamToEndAsync[Tin,Tout](IAsyncEnumerable`1 stream, IAppender`2 appender, Action`1 itemCallback)
at OllamaSharp.IAsyncEnumerableExtensions.StreamToEndAsync[Tin,Tout](IAsyncEnumerable`1 stream, IAppender`2 appender, Action`1 itemCallback)
at OllamaSharp.OllamaApiClient.Microsoft.Extensions.AI.IChatClient.GetResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)
at Microsoft.Extensions.AI.FunctionInvokingChatClient.GetResponseAsync(IEnumerable`1 messages, ChatOptions options, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.ChatCompletion.ChatClientChatCompletionService.GetChatMessageContentsAsync(ChatHistory chatHistory, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)
at Microsoft.SemanticKernel.ChatCompletion.ChatCompletionServiceExtensions.GetChatMessageContentAsync(IChatCompletionService chatCompletionService, ChatHistory chatHistory, PromptExecutionSettings executionSettings, Kernel kernel, CancellationToken cancellationToken)
at MyApplication.Util.OpenAIHelper.AnalyzeVision(String text, ReadOnlyMemory`1 image, String mimeType, OllamaPromptExecutionSettings settings, CancellationToken cancellationToken)
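One way to narrow this down is to bypass Semantic Kernel and OllamaSharp entirely and POST the same text-plus-image payload to Ollama's documented `/api/chat` endpoint, with the image passed as a base64 string in the message's `images` array. If the raw request also returns a 500, the regression is on the Ollama server side; if it succeeds, the newer connector is mapping the image content differently. A minimal sketch, assuming a local Ollama instance (the endpoint and JSON shape follow Ollama's public REST API; the image file path is a placeholder):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Json;
using System.Threading.Tasks;

internal static class OllamaRawCheck
{
	// Sends a text + image chat request straight to Ollama's /api/chat
	// endpoint, bypassing Semantic Kernel and OllamaSharp entirely.
	public static async Task Main()
	{
		using var client = new HttpClient { Timeout = TimeSpan.FromMinutes(2) };

		byte[] imageBytes = await File.ReadAllBytesAsync("test.jpg"); // placeholder path
		var payload = new
		{
			model = "llama3.2-vision",
			stream = false,
			messages = new[]
			{
				new
				{
					role = "user",
					content = "Describe this image.",
					images = new[] { Convert.ToBase64String(imageBytes) },
				},
			},
		};

		using var response = await client.PostAsJsonAsync("http://localhost:11434/api/chat", payload);
		Console.WriteLine($"{(int)response.StatusCode} {response.ReasonPhrase}");
		Console.WriteLine(await response.Content.ReadAsStringAsync());
	}
}
```

If this raw call succeeds against the same server where the 1.56 connector fails, comparing the connector's outgoing request (for example via a logging DelegatingHandler on the injected HttpClient) against this payload should show what changed between versions.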

Platform that works

  • Language: C#
  • Source:
    Microsoft.SemanticKernel, Version 1.31.0
    Microsoft.SemanticKernel.Connectors.Ollama, Version 1.31.0-alpha
  • AI model: llama3.2-vision
  • IDE: Visual Studio
  • OS: Windows

Platform that does not work

  • Language: C#
  • Source:
    Microsoft.SemanticKernel, Version 1.56.0
    Microsoft.SemanticKernel.Connectors.Ollama, Version 1.56.0-alpha
  • AI model: llama3.2-vision
  • IDE: Visual Studio
  • OS: Windows

Metadata

Labels

.NET (Issue or Pull requests regarding .NET code)

Status

Sprint: Done
