.Net: Add support for arbitrary service attributes - Option #1 (#3415)
### Motivation and Context


### Description


### Contribution Checklist


- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄
markwallace-microsoft committed Nov 13, 2023
1 parent f46ac77 commit 5e5b4a4
Showing 39 changed files with 676 additions and 137 deletions.
157 changes: 157 additions & 0 deletions docs/decisions/0021-aiservice-metadata.md
@@ -0,0 +1,157 @@
---
# These are optional elements. Feel free to remove any of them.
status: proposed
date: 2023-11-10
deciders: SergeyMenshykh, markwallace, rbarreto, dmytrostruk
consulted:
informed:
---
# Add AI Service Metadata

## Context and Problem Statement

Developers need to know more about the `IAIService` that will be used to execute a semantic function or a plan.
Some examples of why they need this information:

1. As an SK developer I want to write an `IAIServiceSelector` which allows me to select the OpenAI service to use based on the configured model id, so that I can select the optimum (e.g., the cheapest) model for the prompt I am executing.
2. As an SK developer I want to write a pre-invocation hook which will compute the token size of a prompt before the prompt is sent to the LLM, so that I can determine the optimum `IAIService` to use. The library I am using to compute the token size of the prompt requires the model id.

The current implementation of `IAIService` is empty.

```csharp
public interface IAIService
{
}
```

We can retrieve `IAIService` instances using `T IKernel.GetService<T>(string? name = null) where T : IAIService;` i.e., by service type and name (aka service id).
The concrete instance of an `IAIService` can have different attributes depending on the service provider e.g. Azure OpenAI has a deployment name and OpenAI services have a model id.

Consider the following code snippet:

```csharp
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithAzureChatCompletionService(
deploymentName: chatDeploymentName,
endpoint: endpoint,
serviceId: "AzureOpenAIChat",
apiKey: apiKey)
.WithOpenAIChatCompletionService(
modelId: openAIModelId,
serviceId: "OpenAIChat",
apiKey: openAIApiKey)
.Build();

var service = kernel.GetService<IChatCompletion>("OpenAIChat");
```

For Azure OpenAI we create the service with a deployment name. This is an arbitrary name specified by the person who deployed the AI model e.g. it could be `eastus-gpt-4` or `foo-bar`.
For OpenAI we create the service with a model id. This must match one of the deployed OpenAI models.

From the perspective of a prompt creator using OpenAI, they will typically tune their prompts based on the model. So when the prompt is executed we need to be able to retrieve the service using the model id. As shown in the code snippet above, `IKernel` only supports retrieving an `IAIService` instance by service type and id. Additionally, `IChatCompletion` is a generic interface, so it doesn't contain any properties which provide information about a specific connector instance.

## Decision Drivers

* We need a mechanism to store generic metadata for an `IAIService` instance.
* It will be the responsibility of the concrete `IAIService` instance to store the metadata that is relevant e.g., model id for OpenAI and HuggingFace AI services.
* We need to be able to iterate over the available `IAIService` instances.

## Considered Options

* Option #1
  * Extend `IAIService` to include the following properties:
    * `string? ModelId { get; }` which returns the model id. It will be the responsibility of each `IAIService` implementation to populate this with the appropriate value.
    * `IReadOnlyDictionary<string, object> Attributes { get; }` which returns the attributes as a read-only dictionary. It will be the responsibility of each `IAIService` implementation to populate this with the appropriate metadata.
  * Extend `INamedServiceProvider` to include the method `ICollection<T> GetServices<T>() where T : TService;`
  * Extend `OpenAIKernelBuilderExtensions` so that `WithAzureXXX` methods include a `modelId` property if a specific model can be targeted.
* Option #2
  * Extend `IAIService` to include the following method:
    * `T? GetAttributes<T>() where T : AIServiceAttributes;` which returns an instance of `AIServiceAttributes`. It will be the responsibility of each `IAIService` implementation to define its own service attributes class and populate this with the appropriate values.
  * Extend `INamedServiceProvider` to include the method `ICollection<T> GetServices<T>() where T : TService;`
  * Extend `OpenAIKernelBuilderExtensions` so that `WithAzureXXX` methods include a `modelId` property if a specific model can be targeted.
* Option #3
  * Same as Option #2, plus:
  * Extend `IAIService` to include the following properties:
    * `public IReadOnlyDictionary<string, object> Attributes => this.InternalAttributes;` which returns a read-only dictionary. It will be the responsibility of each `IAIService` implementation to define its own service attributes class and populate this with the appropriate values.
    * `ModelId`
    * `Endpoint`
    * `ApiVersion`
  * Extend `INamedServiceProvider` to include the method `ICollection<T> GetServices<T>() where T : TService;`
  * Extend `OpenAIKernelBuilderExtensions` so that `WithAzureXXX` methods include a `modelId` property if a specific model can be targeted.
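
The Option 1 interface changes can be sketched as follows. This is an illustrative sketch based on the bullets above, not the final shipped code; in particular the shape of `INamedServiceProvider` shown here is an assumption.

```csharp
public interface IAIService
{
    // Model id of the underlying AI model (e.g. "gpt-3.5-turbo" for OpenAI),
    // populated by each implementation; may be null if not applicable.
    string? ModelId { get; }

    // Provider-specific metadata, e.g. a deployment name for Azure OpenAI.
    IReadOnlyDictionary<string, object> Attributes { get; }
}

public interface INamedServiceProvider<TService>
{
    // Existing lookup by service type and optional name (service id).
    T? GetService<T>(string? name = null) where T : TService;

    // New: enumerate all registered services of a given type.
    ICollection<T> GetServices<T>() where T : TService;
}
```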

These options would be used as follows:

As an SK developer I want to write a custom `IAIServiceSelector` which will select an AI service based on the model id, because I want to restrict which LLM is used.
In the sample below the service selector implementation looks for the first service that is a GPT-3 model.

### Option 1

```csharp
public class Gpt3xAIServiceSelector : IAIServiceSelector
{
public (T?, AIRequestSettings?) SelectAIService<T>(string renderedPrompt, IAIServiceProvider serviceProvider, IReadOnlyList<AIRequestSettings>? modelSettings) where T : IAIService
{
var services = serviceProvider.GetServices<T>();
foreach (var service in services)
{
if (!string.IsNullOrEmpty(service.ModelId) && service.ModelId.StartsWith("gpt-3", StringComparison.OrdinalIgnoreCase))
{
Console.WriteLine($"Selected model: {service.ModelId}");
return (service, new OpenAIRequestSettings());
}
}

throw new SKException("Unable to find AI service for GPT 3.x.");
}
}
```

### Option 2

```csharp
public class Gpt3xAIServiceSelector : IAIServiceSelector
{
public (T?, AIRequestSettings?) SelectAIService<T>(string renderedPrompt, IAIServiceProvider serviceProvider, IReadOnlyList<AIRequestSettings>? modelSettings) where T : IAIService
{
var services = serviceProvider.GetServices<T>();
foreach (var service in services)
{
var serviceModelId = service.GetAttributes<AIServiceAttributes>()?.ModelId;
if (!string.IsNullOrEmpty(serviceModelId) && serviceModelId.StartsWith("gpt-3", StringComparison.OrdinalIgnoreCase))
{
Console.WriteLine($"Selected model: {serviceModelId}");
return (service, new OpenAIRequestSettings());
}
}

throw new SKException("Unable to find AI service for GPT 3.x.");
}
}
```
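
Option 2 presupposes an `AIServiceAttributes` base class that each connector specializes. A minimal sketch of what that hierarchy might look like (the class and property names are assumptions, not the shipped API):

```csharp
public class AIServiceAttributes
{
    // Attributes common to all AI services.
    public string? ModelId { get; init; }
    public string? Endpoint { get; init; }
}

public sealed class OpenAIServiceAttributes : AIServiceAttributes
{
    // OpenAI-specific metadata.
    public string? Organization { get; init; }
}

public sealed class AzureOpenAIServiceAttributes : AIServiceAttributes
{
    // Azure OpenAI-specific metadata.
    public string? DeploymentName { get; init; }
}
```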

### Option 3

```csharp
public (T?, AIRequestSettings?) SelectAIService<T>(string renderedPrompt, IAIServiceProvider serviceProvider, IReadOnlyList<AIRequestSettings>? modelSettings) where T : IAIService
{
var services = serviceProvider.GetServices<T>();
foreach (var service in services)
{
var serviceModelId = service.GetModelId();
var serviceOrganization = service.GetAttribute(OpenAIServiceAttributes.OrganizationKey);
var serviceDeploymentName = service.GetAttribute(AzureOpenAIServiceAttributes.DeploymentNameKey);
if (!string.IsNullOrEmpty(serviceModelId) && serviceModelId.StartsWith("gpt-3", StringComparison.OrdinalIgnoreCase))
{
Console.WriteLine($"Selected model: {serviceModelId}");
return (service, new OpenAIRequestSettings());
}
}

throw new SKException("Unable to find AI service for GPT 3.x.");
}
```
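
The `GetModelId()` and `GetAttribute(...)` calls above imply key-based helper accessors layered over the attributes dictionary. One possible shape, where the key constant and extension methods are illustrative assumptions:

```csharp
public static class AIServiceExtensions
{
    public const string ModelIdKey = "ModelId";

    // Look up a single attribute by key, returning null when absent.
    public static string? GetAttribute(this IAIService service, string key) =>
        service.Attributes.TryGetValue(key, out object? value) ? value as string : null;

    // Convenience accessor for the well-known model id attribute.
    public static string? GetModelId(this IAIService service) =>
        service.GetAttribute(ModelIdKey);
}
```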

## Decision Outcome

Chosen option: Option 1, because it's a simple implementation and allows easy iteration over all possible attributes.
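
With Option 1, each connector populates the metadata itself at construction time. A hedged sketch of how an implementation might do this (simplified and hypothetical, not the actual connector code):

```csharp
public sealed class MyOpenAIChatCompletionService : IAIService
{
    private readonly Dictionary<string, object> _attributes = new();

    public MyOpenAIChatCompletionService(string modelId)
    {
        // Expose the model id both as a first-class property and as an attribute.
        this.ModelId = modelId;
        this._attributes["ModelId"] = modelId;
    }

    public string? ModelId { get; }

    public IReadOnlyDictionary<string, object> Attributes => this._attributes;
}
```

An `IAIServiceSelector` can then filter on `ModelId`, or on entries in `Attributes`, without knowing the concrete connector type.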
@@ -300,7 +300,7 @@ private static SemanticTextMemory InitializeMemory()
var memoryStorage = new VolatileMemoryStore();

var textEmbeddingGenerator = new AzureOpenAITextEmbeddingGeneration(
-modelId: TestConfiguration.AzureOpenAIEmbeddings.DeploymentName,
+deploymentName: TestConfiguration.AzureOpenAIEmbeddings.DeploymentName,
endpoint: TestConfiguration.AzureOpenAIEmbeddings.Endpoint,
apiKey: TestConfiguration.AzureOpenAIEmbeddings.ApiKey);

6 changes: 6 additions & 0 deletions dotnet/samples/KernelSyntaxExamples/Example16_CustomLLM.cs
@@ -29,8 +29,14 @@
*/
public class MyTextCompletionService : ITextCompletion
{
public string? ModelId { get; private set; }

public IReadOnlyDictionary<string, string> Attributes => new Dictionary<string, string>();

public Task<IReadOnlyList<ITextResult>> GetCompletionsAsync(string text, AIRequestSettings? requestSettings, CancellationToken cancellationToken = default)
{
this.ModelId = requestSettings?.ModelId;

return Task.FromResult<IReadOnlyList<ITextResult>>(new List<ITextResult>
{
new MyTextCompletionStreamingResult()
@@ -20,6 +20,10 @@
*/
public sealed class MyChatCompletionService : IChatCompletion
{
public string? ModelId { get; private set; }

public IReadOnlyDictionary<string, string> Attributes => new Dictionary<string, string>();

public ChatHistory CreateNewChat(string? instructions = null)
{
var chatHistory = new MyChatHistory();
@@ -73,7 +73,7 @@ public static Task RunAsync()
var loggerFactory = NullLoggerFactory.Instance;
var memoryStorage = new VolatileMemoryStore();
var textEmbeddingGenerator = new AzureOpenAITextEmbeddingGeneration(
-modelId: azureOpenAIEmbeddingDeployment,
+deploymentName: azureOpenAIEmbeddingDeployment,
endpoint: azureOpenAIEndpoint,
apiKey: azureOpenAIKey,
loggerFactory: loggerFactory);
@@ -88,11 +88,11 @@ public static Task RunAsync()
using var httpClient = new HttpClient(httpHandler);
var aiServices = new AIServiceCollection();
ITextCompletion Factory() => new AzureOpenAIChatCompletion(
-modelId: azureOpenAIChatCompletionDeployment,
+deploymentName: azureOpenAIChatCompletionDeployment,
endpoint: azureOpenAIEndpoint,
apiKey: azureOpenAIKey,
-httpClient,
-loggerFactory);
+httpClient: httpClient,
+loggerFactory: loggerFactory);
aiServices.SetService("foo", Factory);
IAIServiceProvider aiServiceProvider = aiServices.Build();

2 changes: 1 addition & 1 deletion dotnet/samples/KernelSyntaxExamples/Example52_ApimAuth.cs
@@ -62,7 +62,7 @@ public static async Task RunAsync()
var kernel = new KernelBuilder()
.WithLoggerFactory(loggerFactory)
.WithAIService<IChatCompletion>(TestConfiguration.AzureOpenAI.ChatDeploymentName, (loggerFactory) =>
-new AzureOpenAIChatCompletion(TestConfiguration.AzureOpenAI.ChatDeploymentName, openAIClient, loggerFactory))
+new AzureOpenAIChatCompletion(deploymentName: TestConfiguration.AzureOpenAI.ChatDeploymentName, openAIClient: openAIClient, loggerFactory: loggerFactory))
.Build();

// Load semantic plugin defined with prompt templates
64 changes: 53 additions & 11 deletions dotnet/samples/KernelSyntaxExamples/Example61_MultipleLLMs.cs
@@ -1,9 +1,11 @@
// Copyright (c) Microsoft. All rights reserved.

using System;
using System.Collections.Generic;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.AI;
using Microsoft.SemanticKernel.TemplateEngine;
using RepoUtils;

// ReSharper disable once InconsistentNaming
@@ -16,11 +18,12 @@ public static async Task RunAsync()
{
Console.WriteLine("======== Example61_MultipleLLMs ========");

-string apiKey = TestConfiguration.AzureOpenAI.ApiKey;
-string chatDeploymentName = TestConfiguration.AzureOpenAI.ChatDeploymentName;
-string endpoint = TestConfiguration.AzureOpenAI.Endpoint;
+string azureApiKey = TestConfiguration.AzureOpenAI.ApiKey;
+string azureDeploymentName = TestConfiguration.AzureOpenAI.ChatDeploymentName;
+string azureModelId = TestConfiguration.AzureOpenAI.ChatModelId;
+string azureEndpoint = TestConfiguration.AzureOpenAI.Endpoint;

-if (apiKey == null || chatDeploymentName == null || endpoint == null)
+if (azureApiKey == null || azureDeploymentName == null || azureEndpoint == null)
{
Console.WriteLine("AzureOpenAI endpoint, apiKey, or deploymentName not found. Skipping example.");
return;
@@ -38,23 +41,25 @@ public static async Task RunAsync()
IKernel kernel = new KernelBuilder()
.WithLoggerFactory(ConsoleLogger.LoggerFactory)
.WithAzureOpenAIChatCompletionService(
-deploymentName: chatDeploymentName,
-endpoint: endpoint,
+deploymentName: azureDeploymentName,
+endpoint: azureEndpoint,
 serviceId: "AzureOpenAIChat",
-apiKey: apiKey)
+modelId: azureModelId,
+apiKey: azureApiKey)
.WithOpenAIChatCompletionService(
modelId: openAIModelId,
serviceId: "OpenAIChat",
apiKey: openAIApiKey)
.Build();

-await RunSemanticFunctionAsync(kernel, "AzureOpenAIChat");
-await RunSemanticFunctionAsync(kernel, "OpenAIChat");
+await RunByServiceIdAsync(kernel, "AzureOpenAIChat");
+await RunByModelIdAsync(kernel, openAIModelId);
+await RunByFirstModelIdAsync(kernel, "gpt-4-1106-preview", azureModelId, openAIModelId);
}

-public static async Task RunSemanticFunctionAsync(IKernel kernel, string serviceId)
+public static async Task RunByServiceIdAsync(IKernel kernel, string serviceId)
 {
-Console.WriteLine($"======== {serviceId} ========");
+Console.WriteLine($"======== Service Id: {serviceId} ========");

var prompt = "Hello AI, what can you do for me?";

@@ -66,4 +71,41 @@ public static async Task RunSemanticFunctionAsync(IKernel kernel, string service
});
Console.WriteLine(result.GetValue<string>());
}

public static async Task RunByModelIdAsync(IKernel kernel, string modelId)
{
Console.WriteLine($"======== Model Id: {modelId} ========");

var prompt = "Hello AI, what can you do for me?";

var result = await kernel.InvokeSemanticFunctionAsync(
prompt,
requestSettings: new AIRequestSettings()
{
ModelId = modelId
});
Console.WriteLine(result.GetValue<string>());
}

public static async Task RunByFirstModelIdAsync(IKernel kernel, params string[] modelIds)
{
Console.WriteLine($"======== Model Ids: {string.Join(", ", modelIds)} ========");

var prompt = "Hello AI, what can you do for me?";

var modelSettings = new List<AIRequestSettings>();
foreach (var modelId in modelIds)
{
modelSettings.Add(new AIRequestSettings() { ModelId = modelId });
}
var promptTemplateConfig = new PromptTemplateConfig() { ModelSettings = modelSettings };

var skfunction = kernel.RegisterSemanticFunction(
"HelloAI",
prompt,
promptTemplateConfig);

var result = await kernel.RunAsync(skfunction);
Console.WriteLine(result.GetValue<string>());
}
}
