
.Net: Ollama AI Connector #3603

Closed

wants to merge 4 commits

Conversation

BLaZeKiLL

Motivation and Context

I am currently participating in the Microsoft AI Classroom Hackathon, for which I am building an app using Semantic Kernel. Since I have a student account with Azure, I don't have access to Azure OpenAI services, but Ollama allows us to self-host LLM models based on Llama 2.

  1. Why is this change required?
    With this change, Semantic Kernel can use Ollama as an AI connector.
  2. What problem does it solve?
    It provides more options for AI services and allows integration with self-hosted Ollama instances.
  3. What scenario does it contribute to?
    Plugins and memory plugins created by the community.
  4. If it fixes an open issue, please link to the issue here.
    N/A

Description

I followed the implementation of the similar feature in Python (#3055) and the HuggingFace connector, as it also operates over HTTP.
I have also added a PingOllamaAsync method, which can be used to check whether the specified model is available on the Ollama instance pointed to by the Ollama base URL.
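
For readers unfamiliar with Ollama's API, the sketch below shows one way such an availability check could look, using the /api/tags endpoint that lists the models present on an instance. This is an illustrative approximation under that assumption, not the PR's actual PingOllamaAsync code; the class and method names are hypothetical.

using System;
using System.Linq;
using System.Net.Http;
using System.Text.Json;
using System.Threading.Tasks;

// Hypothetical helper, not the PR's class: checks whether a model is present on an Ollama instance.
public sealed class OllamaAvailabilityCheck
{
    private readonly HttpClient _http;
    private readonly Uri _baseUri;

    public OllamaAvailabilityCheck(Uri baseUri, HttpClient http)
    {
        _baseUri = baseUri;
        _http = http;
    }

    // Returns true if the given model is listed by the Ollama instance at the base URL.
    public async Task<bool> IsModelAvailableAsync(string model)
    {
        // GET /api/tags returns a JSON document such as {"models":[{"name":"llama2:latest", ...}]}.
        using var response = await _http.GetAsync(new Uri(_baseUri, "/api/tags"));
        response.EnsureSuccessStatusCode();

        using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return json.RootElement.GetProperty("models")
            .EnumerateArray()
            .Any(m => m.GetProperty("name").GetString()?.StartsWith(model, StringComparison.OrdinalIgnoreCase) == true);
    }
}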

I was unable to execute all tests as I don't have access to OpenAI or Azure OpenAI.

I wasn't able to test the streaming implementation, so I'm not sure if it works. I would appreciate any feedback on the implementation, as I am still new to Semantic Kernel.

Contribution Checklist

@BLaZeKiLL requested a review from a team as a code owner November 22, 2023 07:49
@shawncal added the .NET (Issue or Pull requests regarding .NET code) and kernel (Issues or pull requests impacting the core kernel) labels Nov 22, 2023
@github-actions bot changed the title from "Ollama AI Connector" to ".Net: Ollama AI Connector" Nov 22, 2023
@BLaZeKiLL
Author

@BLaZeKiLL please read the following Contributor License Agreement (CLA). If you agree with the CLA, please reply with the following information.

@microsoft-github-policy-service agree [company="{your company}"]

Options:

  • (default - no company specified) I have sole ownership of intellectual property rights to my Submissions and I am not making Submissions in the course of work for my employer.
    @microsoft-github-policy-service agree
  • (when company given) I am making Submissions in the course of work for my employer (or my employer has intellectual property rights in my Submissions by contract or applicable law). I have permission from my employer to make Submissions and enter into this Agreement on behalf of my employer. By signing below, the defined term “You” includes me and my employer.
    @microsoft-github-policy-service agree company="Microsoft"

Contributor License Agreement

@microsoft-github-policy-service agree

@markwallace-microsoft
Member

markwallace-microsoft commented Nov 22, 2023

Thanks @BLaZeKiLL, we'll take a look at this soon and provide feedback if anything needs to be fixed before we merge.

@awaescher

Looking forward to this!

Maybe you could simplify the code in OllamaTextCompletion with OllamaSharp

var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
await ollama.StreamCompletion("How are you today?", "llama2", stream => Console.Write(stream.Response));
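
For comparison, OllamaSharp is a client library over Ollama's HTTP API, so a connector that avoids the extra dependency would issue roughly the following request itself. This is a minimal non-streaming sketch against the /api/generate endpoint, assuming the standard Ollama REST API; it is not code from this PR, and the names are illustrative.

using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

// Illustrative only: a minimal non-streaming completion request to Ollama's /api/generate endpoint.
public static class OllamaRawCompletionSample
{
    public static async Task<string> CompleteAsync(HttpClient http, Uri baseUri, string model, string prompt)
    {
        var payload = JsonSerializer.Serialize(new { model, prompt, stream = false });
        using var content = new StringContent(payload, Encoding.UTF8, "application/json");

        using var response = await http.PostAsync(new Uri(baseUri, "/api/generate"), content);
        response.EnsureSuccessStatusCode();

        // With "stream": false the endpoint returns a single JSON object whose "response"
        // property holds the generated text.
        using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return json.RootElement.GetProperty("response").GetString() ?? string.Empty;
    }
}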

@BLaZeKiLL
Author

Looking forward to this!

Maybe you could simplify the code in OllamaTextCompletion with OllamaSharp

var uri = new Uri("http://localhost:11434");
var ollama = new OllamaApiClient(uri);
await ollama.StreamCompletion("How are you today?", "llama2", stream => Console.Write(stream.Response));

I'll take a look. If plugins are to be hosted separately, it won't be an issue to add OllamaSharp as a dependency.

@BLaZeKiLL
Author

I have released a NuGet package with the above Ollama connector, as well as support for chat completion and embedding generation. You can check it out here: https://www.nuget.org/packages/Codeblaze.SemanticKernel.Connectors.AI.Ollama
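
As background for the embedding support mentioned above, embedding generation against a self-hosted Ollama instance boils down to a single HTTP call. The snippet below is an illustrative sketch of the underlying /api/embeddings request, assuming the standard Ollama REST API; it is not code from the linked NuGet package.

using System;
using System.Linq;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

// Illustrative sketch: request an embedding from Ollama's /api/embeddings endpoint.
public static class OllamaEmbeddingSample
{
    public static async Task<float[]> EmbedAsync(HttpClient http, Uri baseUri, string model, string text)
    {
        var payload = JsonSerializer.Serialize(new { model, prompt = text });
        using var content = new StringContent(payload, Encoding.UTF8, "application/json");

        using var response = await http.PostAsync(new Uri(baseUri, "/api/embeddings"), content);
        response.EnsureSuccessStatusCode();

        // The response is a JSON object of the form {"embedding": [0.12, -0.34, ...]}.
        using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return json.RootElement.GetProperty("embedding")
            .EnumerateArray()
            .Select(v => (float)v.GetDouble())
            .ToArray();
    }
}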

@markwallace-microsoft
Member

@BLaZeKiLL Thanks for your contribution. @RogerBarreto is working on our strategy to expand our AI Connectors. We will have an ADR available soon describing the approach we want to use, and we will work with you to progress your PR.

@BLaZeKiLL
Author

Looking forward to it

@markwallace-microsoft
Member

Ollama has announced OpenAI compatibility, see: https://ollama.ai/blog/openai-compatibility

So we may be able to use our existing OpenAI connector. @RogerBarreto, can you investigate whether this is the case?
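
For illustration, the compatibility layer described in that blog post exposes OpenAI-style endpoints under /v1, so a standard OpenAI-format chat request can be sent straight to a local instance. The sketch below assumes the documented /v1/chat/completions route and a locally running llama2 model; it is illustrative, not the existing connector's code.

using System;
using System.Net.Http;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

// Illustrative sketch: send an OpenAI-format chat completion request to a local Ollama instance.
public static class OllamaOpenAiCompatibilitySample
{
    public static async Task<string> ChatAsync(HttpClient http, string userMessage)
    {
        var payload = JsonSerializer.Serialize(new
        {
            model = "llama2",
            messages = new[] { new { role = "user", content = userMessage } }
        });
        using var content = new StringContent(payload, Encoding.UTF8, "application/json");

        // Ollama serves the OpenAI-compatible API under /v1 on its usual port.
        using var response = await http.PostAsync("http://localhost:11434/v1/chat/completions", content);
        response.EnsureSuccessStatusCode();

        // The response follows the OpenAI schema: choices[0].message.content holds the reply.
        using var json = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        return json.RootElement.GetProperty("choices")[0]
            .GetProperty("message").GetProperty("content").GetString() ?? string.Empty;
    }
}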

@RogerBarreto
Member

Closing this PR and assigning the original author, @BLaZeKiLL, to the new one: #4789.

Please ping me if you have any questions.
