---
title: Use Custom and Local AI Models with the Semantic Kernel SDK for .NET
description: Learn how to use custom or local models for text generation and chat completions in Semantic Kernel SDK for .NET.
author: haywoodsloan
ms.topic: how-to
ms.date: 04/11/2024
---

# Use custom and local AI models with the Semantic Kernel SDK

This article demonstrates how to integrate custom and local models into the Semantic Kernel SDK and use them for text generation and chat completions.

You can adapt these steps to work with any model you can access, regardless of where or how you access it. For example, you can integrate the codellama model with the Semantic Kernel SDK to enable code generation and discussion.

Custom and local models often provide access via REST APIs; for example, see Ollama OpenAI compatibility. Before you integrate your model, it must be hosted and accessible to your .NET application via HTTPS.

## Prerequisites

## Implement text generation using a local model

The following steps show how to integrate your model with the Semantic Kernel SDK and then use it to generate text completions.

  1. Create a service class that implements the ITextGenerationService interface. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/services/MyTextGenerationService.cs" id="service":::

  2. Include the new service class when building the Kernel. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="addTextService":::

  3. Send a text generation prompt to your model directly through the Kernel or using the service class. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="useTextService":::
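The steps above can be sketched as a single file, assuming the Microsoft.SemanticKernel NuGet package is referenced. The `MyTextGenerationService` class name, the `"myTextService"` service key, and the `CallModelAsync` helper are illustrative placeholders; a real implementation would replace `CallModelAsync` with an HTTP request to your hosted model's REST API.

```csharp
using System.Runtime.CompilerServices;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.TextGeneration;

// Step 2: register the custom service when building the Kernel.
var builder = Kernel.CreateBuilder();
builder.Services.AddKeyedSingleton<ITextGenerationService>(
    "myTextService", new MyTextGenerationService());
Kernel kernel = builder.Build();

// Step 3: send a prompt through the Kernel, or call the service directly.
Console.WriteLine(await kernel.InvokePromptAsync("Write a haiku about coding."));

ITextGenerationService textService =
    kernel.GetRequiredService<ITextGenerationService>("myTextService");
TextContent result = await textService.GetTextContentAsync("Write a haiku about coding.");
Console.WriteLine(result.Text);

// Step 1: a minimal ITextGenerationService implementation.
public sealed class MyTextGenerationService : ITextGenerationService
{
    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<TextContent>> GetTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        string completion = await CallModelAsync(prompt, cancellationToken);
        return [new TextContent(completion)];
    }

    public async IAsyncEnumerable<StreamingTextContent> GetStreamingTextContentsAsync(
        string prompt,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // A real implementation would stream tokens as the model produces them.
        string completion = await CallModelAsync(prompt, cancellationToken);
        yield return new StreamingTextContent(completion);
    }

    // Placeholder for the call to your model's REST API.
    private static Task<string> CallModelAsync(string prompt, CancellationToken ct) =>
        Task.FromResult($"(model output for: {prompt})");
}
```

Registering the service with a key lets you host multiple models side by side and select one explicitly; when only one text generation service is registered, the Kernel's default service selector uses it for prompt invocations.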

## Implement chat completion using a local model

The following steps show how to integrate your model with the Semantic Kernel SDK and then use it for chat completions.

  1. Create a service class that implements the IChatCompletionService interface. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/services/MyChatCompletionService.cs" id="service":::

  2. Include the new service class when building the Kernel. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="addChatService":::

  3. Send a chat completion prompt to your model directly through the Kernel or using the service class. For example:

    :::code language="csharp" source="./snippets/semantic-kernel/LocalModelExamples.cs" id="useChatService":::
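As with text generation, the chat steps can be sketched in one file, assuming the Microsoft.SemanticKernel NuGet package. The `MyChatCompletionService` class name, the `"myChatService"` key, and the `CallModelAsync` helper are illustrative; substitute a request to your model's chat API, such as an Ollama OpenAI-compatible endpoint.

```csharp
using System.Runtime.CompilerServices;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;

// Step 2: register the custom service when building the Kernel.
var builder = Kernel.CreateBuilder();
builder.Services.AddKeyedSingleton<IChatCompletionService>(
    "myChatService", new MyChatCompletionService());
Kernel kernel = builder.Build();

// Step 3: build a chat history and request a completion.
IChatCompletionService chatService =
    kernel.GetRequiredService<IChatCompletionService>("myChatService");
var history = new ChatHistory();
history.AddUserMessage("Explain recursion in one sentence.");
ChatMessageContent reply = await chatService.GetChatMessageContentAsync(history);
Console.WriteLine(reply.Content);

// Step 1: a minimal IChatCompletionService implementation.
public sealed class MyChatCompletionService : IChatCompletionService
{
    public IReadOnlyDictionary<string, object?> Attributes { get; } =
        new Dictionary<string, object?>();

    public async Task<IReadOnlyList<ChatMessageContent>> GetChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        CancellationToken cancellationToken = default)
    {
        string reply = await CallModelAsync(chatHistory, cancellationToken);
        return [new ChatMessageContent(AuthorRole.Assistant, reply)];
    }

    public async IAsyncEnumerable<StreamingChatMessageContent> GetStreamingChatMessageContentsAsync(
        ChatHistory chatHistory,
        PromptExecutionSettings? executionSettings = null,
        Kernel? kernel = null,
        [EnumeratorCancellation] CancellationToken cancellationToken = default)
    {
        // A real implementation would stream chunks as the model produces them.
        string reply = await CallModelAsync(chatHistory, cancellationToken);
        yield return new StreamingChatMessageContent(AuthorRole.Assistant, reply);
    }

    // Placeholder for the call to your model's chat API; echoes the last message.
    private static Task<string> CallModelAsync(ChatHistory history, CancellationToken ct) =>
        Task.FromResult($"(model reply to: {history[^1].Content})");
}
```

The `ChatHistory` object carries the full conversation, so your service implementation can forward prior user and assistant turns to the model on each request.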

## Related content