| title | titleSuffix | description | author | ms.topic | ms.date |
|---|---|---|---|---|---|
| Use Custom and Local AI Models with the Semantic Kernel SDK for .NET | | Learn how to use custom or local models for text generation and chat completions in the Semantic Kernel SDK for .NET. | haywoodsloan | how-to | 04/11/2024 |
This article demonstrates how to integrate custom and local models into the Semantic Kernel SDK and use them for text generation and chat completions.
You can adapt the steps to use them with any model that you can access, regardless of where or how you access it. For example, you can integrate the codellama model with the Semantic Kernel SDK to enable code generation and discussion.
Custom and local models often provide access via REST APIs; for example, see Ollama OpenAI compatibility. Before you integrate your model, it must be hosted and accessible to your .NET application over HTTPS.
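For example, if your model is served through Ollama's OpenAI-compatible API, a quick connectivity check might look like the following sketch. The base address `http://localhost:11434` and the model name `codellama` are assumptions about your local setup; substitute your own deployment's values.

```csharp
using System.Net.Http.Json;

// Sketch: confirm a locally hosted model (here, Ollama's OpenAI-compatible
// endpoint) is reachable before wiring it into the Semantic Kernel SDK.
// The base address and model name are placeholders for your own deployment.
using var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };

var response = await http.PostAsJsonAsync("/v1/chat/completions", new
{
    model = "codellama",
    messages = new[] { new { role = "user", content = "Say hello." } }
});

Console.WriteLine(response.IsSuccessStatusCode
    ? "Model endpoint is reachable."
    : $"Request failed: {response.StatusCode}");
```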
- An Azure account with an active subscription. Create an account for free.
- .NET SDK
- `Microsoft.SemanticKernel` NuGet package
- A custom or local model, deployed and accessible to your .NET application
The following section shows how you can integrate your model with the Semantic Kernel SDK and then use it to generate text completions.
- Create a service class that implements the `ITextGenerationService` interface. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/services/MyTextGenerationService.cs" id="service":::

- Include the new service class when building the `Kernel`. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="addTextService":::

- Send a text generation prompt to your model directly through the `Kernel` or by using the service class. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="useTextService":::
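Put together, a minimal service class might look like the following sketch. It assumes a hypothetical local REST endpoint at `http://localhost:11434/api/generate`; the request payload and response handling are placeholders to adapt to your model's actual API.

```csharp
using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.TextGeneration;

// Sketch of a custom ITextGenerationService that forwards prompts to a
// hypothetical local REST endpoint. Adapt the URL and payload to your model.
public sealed class MyTextGenerationService : ITextGenerationService
{
    private readonly HttpClient _http = new() { BaseAddress = new Uri("http://localhost:11434") };

    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<TextContent>> GetTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Placeholder request; replace with your model's actual API contract.
        using var response = await _http.PostAsJsonAsync(
            "/api/generate", new { prompt }, cancellationToken);
        response.EnsureSuccessStatusCode();
        string text = await response.Content.ReadAsStringAsync(cancellationToken);
        return [new TextContent(text)];
    }

    public async IAsyncEnumerable<StreamingTextContent> GetStreamingTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Simplest streaming shape: yield the complete response as one chunk.
        foreach (var content in await GetTextContentsAsync(
            prompt, executionSettings, kernel, cancellationToken))
        {
            yield return new StreamingTextContent(content.Text);
        }
    }
}
```

You could then register the service with `builder.Services.AddKeyedSingleton<ITextGenerationService>("myTextService", new MyTextGenerationService());` before calling `builder.Build()`, and send prompts through `kernel.InvokePromptAsync(...)`. The service key `"myTextService"` is an illustrative placeholder.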
The following section shows how you can integrate your model with the Semantic Kernel SDK and then use it for chat completions.
- Create a service class that implements the `IChatCompletionService` interface. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/services/MyChatCompletionService.cs" id="service":::

- Include the new service class when building the `Kernel`. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="addChatService":::

- Send a chat completion prompt to your model directly through the `Kernel` or by using the service class. For example:

  :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="useChatService":::
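As with text generation, a minimal chat service might look like the following sketch. The endpoint path, request payload, and role mapping are assumptions about a hypothetical local model; adapt them to your model's API.

```csharp
using System.Linq;
using System.Net.Http.Json;
using System.Runtime.CompilerServices;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Sketch of a custom IChatCompletionService for a hypothetical local model.
// The endpoint, payload, and response handling are placeholders to adapt.
public sealed class MyChatCompletionService : IChatCompletionService
{
    private readonly HttpClient _http = new() { BaseAddress = new Uri("http://localhost:11434") };

    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        // Map the chat history to whatever request format your model expects.
        var messages = chatHistory
            .Select(m => new { role = m.Role.Label, content = m.Content })
            .ToArray();

        using var response = await _http.PostAsJsonAsync(
            "/v1/chat/completions", new { messages }, cancellationToken);
        response.EnsureSuccessStatusCode();
        string reply = await response.Content.ReadAsStringAsync(cancellationToken);

        return [new ChatMessageContent(AuthorRole.Assistant, reply)];
    }

    public async IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // Simplest streaming shape: yield the full reply as a single chunk.
        foreach (var message in await GetChatMessageContentsAsync(
            chatHistory, executionSettings, kernel, cancellationToken))
        {
            yield return new StreamingChatMessageContent(message.Role, message.Content);
        }
    }
}
```

Registration mirrors the text generation case: add the service with `builder.Services.AddKeyedSingleton<IChatCompletionService>(...)`, then retrieve it with `kernel.GetRequiredService<IChatCompletionService>()` or invoke it through the `Kernel` directly.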