.Net: Update telemetry sample and documentation (#6191)
### Motivation and Context

SK now includes the OTel semantic conventions for generative AI as an experimental feature.
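
For reference, the feature is gated behind `AppContext` switches (renamed in this PR; see the `ModelDiagnostics` diff below). A minimal sketch of enabling it, assuming the switches are set before the first model call:

```csharp
using System;

// Emit spans that follow the OTel semantic conventions for GenAI operations.
AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnostics", true);

// Optionally also record prompts and completions (sensitive data);
// should not be enabled in production.
AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);
```

The updated sample enables the sensitive switch at startup (see `Program.cs` below).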

### Description

This PR updates the telemetry sample app to showcase the feature and removes the use of planners from the sample, since not all connectors work with the Handlebars planner (the Handlebars planner produces multiple system messages, which the Gemini connector does not allow).

This PR also updates the telemetry documentation.
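
For orientation, the core of the showcased wiring is subscribing an OpenTelemetry tracer provider to the Semantic Kernel activity sources and exporting to Application Insights. A minimal sketch, with a placeholder connection string (the sample reads it from configuration):

```csharp
using Azure.Monitor.OpenTelemetry.Exporter;
using OpenTelemetry;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

// Placeholder; the sample loads this from user secrets via TestConfiguration.
const string connectionString = "<application-insights-connection-string>";

var resourceBuilder = ResourceBuilder.CreateDefault().AddService("TelemetryExample");

// Subscribe to all Semantic Kernel activity sources, including the new
// model-request spans, and export them to Application Insights.
using var tracerProvider = Sdk.CreateTracerProviderBuilder()
    .SetResourceBuilder(resourceBuilder)
    .AddSource("Microsoft.SemanticKernel*")
    .AddAzureMonitorTraceExporter(options => options.ConnectionString = connectionString)
    .Build();
```

Metrics and logs are wired the same way with `AddMeter("Microsoft.SemanticKernel*")` and the OpenTelemetry logging provider, as shown in `Program.cs` below.
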
### Contribution Checklist


- [ ] The code builds clean without any errors or warnings
- [ ] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [ ] All unit tests pass, and I have added new tests where possible
- [ ] I didn't break anyone 😄
TaoChenOSU committed May 13, 2024
1 parent 056d73b commit f53c98e
Showing 6 changed files with 253 additions and 45 deletions.
8 changes: 4 additions & 4 deletions docs/decisions/0044-OTel-semantic-convention.md
@@ -58,13 +58,13 @@ block-beta
columns 1
Models
blockArrowId1<["&nbsp;&nbsp;&nbsp;"]>(y)
block:Connectors
block:Clients
columns 3
ConnectorTypeClientA["Instrumented client SDK<br>(i.e. Azure OpenAI client)"]
ConnectorTypeClientB["Un-instrumented Client SDK"]
ConnectorTypeClientC["Custom client on REST API<br>(i.e. HuggingFaceClient)"]
end
Services["AI Services"]
Connectors["AI Connectors"]
blockArrowId2<["&nbsp;&nbsp;&nbsp;"]>(y)
SemanticKernel["Semantic Kernel"]
block:Kernel
@@ -259,8 +259,8 @@ internal static class ModelDiagnostics
private static readonly string s_namespace = typeof(ModelDiagnostics).Namespace;
private static readonly ActivitySource s_activitySource = new(s_namespace);

private const string EnableModelDiagnosticsSettingName = "Microsoft.SemanticKernel.Experimental.EnableModelDiagnostics";
private const string EnableSensitiveEventsSettingName = "Microsoft.SemanticKernel.Experimental.EnableModelDiagnosticsWithSensitiveData";
private const string EnableModelDiagnosticsSettingName = "Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnostics";
private const string EnableSensitiveEventsSettingName = "Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive";

private static readonly bool s_enableSensitiveEvents = AppContextSwitchHelper.GetConfigValue(EnableSensitiveEventsSettingName);
private static readonly bool s_enableModelDiagnostics = AppContextSwitchHelper.GetConfigValue(EnableModelDiagnosticsSettingName) || s_enableSensitiveEvents;
6 changes: 3 additions & 3 deletions dotnet/docs/TELEMETRY.md
@@ -1,9 +1,9 @@
# Telemetry

Telemetry in Semantic Kernel (SK) .NET implementation includes _logging_, _metering_ and _tracing_.
The code is instrumented using native .NET instrumentation tools, which means that it's possible to use different monitoring platforms (e.g. Application Insights, Prometheus, Grafana etc.).
The code is instrumented using native .NET instrumentation tools, which means that it's possible to use different monitoring platforms (e.g. Application Insights, Aspire dashboard, Prometheus, Grafana etc.).

Code example using Application Insights can be found [here](https://github.com/microsoft/semantic-kernel/blob/main/dotnet/samples/TelemetryExample).
Code example using Application Insights can be found [here](../samples/Demos/TelemetryWithAppInsights/).

## Logging

@@ -108,7 +108,7 @@ Tracing is implemented with `Activity` class from `System.Diagnostics` namespace
Available activity sources:

- _Microsoft.SemanticKernel.Planning_ - creates activities for all planners.
- _Microsoft.SemanticKernel_ - creates activities for `KernelFunction`.
- _Microsoft.SemanticKernel_ - creates activities for `KernelFunction` as well as requests to models.

### Examples

203 changes: 170 additions & 33 deletions dotnet/samples/Demos/TelemetryWithAppInsights/Program.cs
@@ -2,55 +2,56 @@

using System;
using System.Diagnostics;
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Linq;
using System.Threading.Tasks;
using Azure.Monitor.OpenTelemetry.Exporter;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Planning.Handlebars;
using Microsoft.SemanticKernel.Connectors.Google;
using Microsoft.SemanticKernel.Connectors.HuggingFace;
using Microsoft.SemanticKernel.Connectors.OpenAI;
using Microsoft.SemanticKernel.Services;
using OpenTelemetry;
using OpenTelemetry.Logs;
using OpenTelemetry.Metrics;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

/// <summary>
/// Example of telemetry in Semantic Kernel using Application Insights within console application.
/// </summary>
public sealed class Program
{
/// <summary>
/// Log level to be used by <see cref="ILogger"/>.
/// </summary>
/// <remarks>
/// <see cref="LogLevel.Information"/> is set by default. <para />
/// <see cref="LogLevel.Trace"/> will enable logging with more detailed information, including sensitive data. Should not be used in production. <para />
/// </remarks>
private const LogLevel MinLogLevel = LogLevel.Information;

/// <summary>
/// Instance of <see cref="ActivitySource"/> for the application activities.
/// </summary>
private static readonly ActivitySource s_activitySource = new("Telemetry.Example");

/// <summary>
/// The main entry point for the application.
/// </summary>
/// <returns>A <see cref="Task"/> representing the asynchronous operation.</returns>
public static async Task Main()
{
// Enable model diagnostics with sensitive data.
AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);

// Load configuration from environment variables or user secrets.
LoadUserSecrets();

var connectionString = TestConfiguration.ApplicationInsights.ConnectionString;
var resourceBuilder = ResourceBuilder
.CreateDefault()
.AddService("TelemetryExample");

using var traceProvider = Sdk.CreateTracerProviderBuilder()
.SetResourceBuilder(resourceBuilder)
.AddSource("Microsoft.SemanticKernel*")
.AddSource("Telemetry.Example")
.AddAzureMonitorTraceExporter(options => options.ConnectionString = connectionString)
.Build();

using var meterProvider = Sdk.CreateMeterProviderBuilder()
.SetResourceBuilder(resourceBuilder)
.AddMeter("Microsoft.SemanticKernel*")
.AddAzureMonitorMetricExporter(options => options.ConnectionString = connectionString)
.Build();
@@ -60,30 +61,117 @@ public static async Task Main()
// Add OpenTelemetry as a logging provider
builder.AddOpenTelemetry(options =>
{
options.SetResourceBuilder(resourceBuilder);
options.AddAzureMonitorLogExporter(options => options.ConnectionString = connectionString);
// Format log messages. This defaults to false.
options.IncludeFormattedMessage = true;
options.IncludeScopes = true;
});
builder.SetMinimumLevel(MinLogLevel);
});

var kernel = GetKernel(loggerFactory);
var planner = CreatePlanner();

using var activity = s_activitySource.StartActivity("Main");
Console.WriteLine($"Operation/Trace ID: {Activity.Current?.TraceId}");
Console.WriteLine();

Console.WriteLine("Operation/Trace ID:");
Console.WriteLine(Activity.Current?.TraceId);
Console.WriteLine("Write a poem about John Doe and translate it to Italian.");
await RunAzureOpenAIChatAsync(kernel);
Console.WriteLine();
await RunGoogleAIChatAsync(kernel);
Console.WriteLine();
await RunHuggingFaceChatAsync(kernel);
}

var plan = await planner.CreatePlanAsync(kernel, "Write a poem about John Doe, then translate it into Italian.");
#region Private
/// <summary>
/// Log level to be used by <see cref="ILogger"/>.
/// </summary>
/// <remarks>
/// <see cref="LogLevel.Information"/> is set by default. <para />
/// <see cref="LogLevel.Trace"/> will enable logging with more detailed information, including sensitive data. Should not be used in production. <para />
/// </remarks>
private const LogLevel MinLogLevel = LogLevel.Information;

Console.WriteLine("Original plan:");
Console.WriteLine(plan.ToString());
/// <summary>
/// Instance of <see cref="ActivitySource"/> for the application activities.
/// </summary>
private static readonly ActivitySource s_activitySource = new("Telemetry.Example");

var result = await plan.InvokeAsync(kernel).ConfigureAwait(false);
private const string AzureOpenAIChatServiceKey = "AzureOpenAIChat";
private const string GoogleAIGeminiChatServiceKey = "GoogleAIGeminiChat";
private const string HuggingFaceChatServiceKey = "HuggingFaceChat";

Console.WriteLine("Result:");
Console.WriteLine(result);
private static async Task RunAzureOpenAIChatAsync(Kernel kernel)
{
Console.WriteLine("============= Azure OpenAI Chat Completion =============");

using var activity = s_activitySource.StartActivity(AzureOpenAIChatServiceKey);
SetTargetService(kernel, AzureOpenAIChatServiceKey);
try
{
await RunChatAsync(kernel);
}
catch (Exception ex)
{
activity?.SetStatus(ActivityStatusCode.Error, ex.Message);
Console.WriteLine($"Error: {ex.Message}");
}
}

private static async Task RunGoogleAIChatAsync(Kernel kernel)
{
Console.WriteLine("============= Google Gemini Chat Completion =============");

using var activity = s_activitySource.StartActivity(GoogleAIGeminiChatServiceKey);
SetTargetService(kernel, GoogleAIGeminiChatServiceKey);

try
{
await RunChatAsync(kernel);
}
catch (Exception ex)
{
activity?.SetStatus(ActivityStatusCode.Error, ex.Message);
Console.WriteLine($"Error: {ex.Message}");
}
}

private static async Task RunHuggingFaceChatAsync(Kernel kernel)
{
Console.WriteLine("============= HuggingFace Chat Completion =============");

using var activity = s_activitySource.StartActivity(HuggingFaceChatServiceKey);
SetTargetService(kernel, HuggingFaceChatServiceKey);

try
{
await RunChatAsync(kernel);
}
catch (Exception ex)
{
activity?.SetStatus(ActivityStatusCode.Error, ex.Message);
Console.WriteLine($"Error: {ex.Message}");
}
}

private static async Task RunChatAsync(Kernel kernel)
{
var poem = await kernel.InvokeAsync<string>(
"WriterPlugin",
"ShortPoem",
new KernelArguments { ["input"] = "Write a poem about John Doe." });
var translatedPoem = await kernel.InvokeAsync<string>(
"WriterPlugin",
"Translate",
new KernelArguments
{
["input"] = poem,
["language"] = "Italian"
});

Console.WriteLine($"Poem:\n{poem}\n\nTranslated Poem:\n{translatedPoem}");
}

private static Kernel GetKernel(ILoggerFactory loggerFactory)
Expand All @@ -93,22 +181,39 @@ private static Kernel GetKernel(ILoggerFactory loggerFactory)
IKernelBuilder builder = Kernel.CreateBuilder();

builder.Services.AddSingleton(loggerFactory);
builder.AddAzureOpenAIChatCompletion(
deploymentName: TestConfiguration.AzureOpenAI.ChatDeploymentName,
modelId: TestConfiguration.AzureOpenAI.ChatModelId,
endpoint: TestConfiguration.AzureOpenAI.Endpoint,
apiKey: TestConfiguration.AzureOpenAI.ApiKey
).Build();
builder
.AddAzureOpenAIChatCompletion(
deploymentName: TestConfiguration.AzureOpenAI.ChatDeploymentName,
modelId: TestConfiguration.AzureOpenAI.ChatModelId,
endpoint: TestConfiguration.AzureOpenAI.Endpoint,
apiKey: TestConfiguration.AzureOpenAI.ApiKey,
serviceId: AzureOpenAIChatServiceKey)
.AddGoogleAIGeminiChatCompletion(
modelId: TestConfiguration.GoogleAI.Gemini.ModelId,
apiKey: TestConfiguration.GoogleAI.ApiKey,
serviceId: GoogleAIGeminiChatServiceKey)
.AddHuggingFaceChatCompletion(
model: TestConfiguration.HuggingFace.ModelId,
endpoint: new Uri("https://api-inference.huggingface.co"),
apiKey: TestConfiguration.HuggingFace.ApiKey,
serviceId: HuggingFaceChatServiceKey);

builder.Services.AddSingleton<IAIServiceSelector>(new AIServiceSelector());
builder.Plugins.AddFromPromptDirectory(Path.Combine(folder, "WriterPlugin"));

return builder.Build();
}

private static HandlebarsPlanner CreatePlanner()
private static void SetTargetService(Kernel kernel, string targetServiceKey)
{
var plannerOptions = new HandlebarsPlannerOptions();
return new HandlebarsPlanner(plannerOptions);
if (kernel.Data.ContainsKey("TargetService"))
{
kernel.Data["TargetService"] = targetServiceKey;
}
else
{
kernel.Data.Add("TargetService", targetServiceKey);
}
}

private static void LoadUserSecrets()
@@ -119,4 +224,36 @@ private static void LoadUserSecrets()
.Build();
TestConfiguration.Initialize(configRoot);
}

private sealed class AIServiceSelector : IAIServiceSelector
{
public bool TrySelectAIService<T>(
Kernel kernel, KernelFunction function, KernelArguments arguments,
[NotNullWhen(true)] out T? service, out PromptExecutionSettings? serviceSettings) where T : class, IAIService
{
var targetServiceKey = kernel.Data.TryGetValue("TargetService", out object? value) ? value : null;
if (targetServiceKey is not null)
{
var targetService = kernel.Services.GetKeyedServices<T>(targetServiceKey).FirstOrDefault();
if (targetService is not null)
{
service = targetService;
serviceSettings = targetServiceKey switch
{
AzureOpenAIChatServiceKey => new OpenAIPromptExecutionSettings(),
GoogleAIGeminiChatServiceKey => new GeminiPromptExecutionSettings(),
HuggingFaceChatServiceKey => new HuggingFacePromptExecutionSettings(),
_ => null,
};

return true;
}
}

service = null;
serviceSettings = null;
return false;
}
}
#endregion
}
