Mock Metadata Usage in a GetStreamingChatMessageContentsAsync response #10638
-
Hi, in this article about unit testing (https://devblogs.microsoft.com/semantic-kernel/unit-testing-with-semantic-kernel/), in the `DoWorkWithPrompt()` method, how do I add metadata with `Usage` values to the returned result?
Replies: 2 comments 3 replies
-
Tagging @dmytrostruk
-
Hi @romainsemur,

```csharp
[Fact]
public async Task DoWorkWithPrompt()
{
    // Arrange
    var mockChatCompletion = new Mock<IChatCompletionService>();
    var mockResponse = new ChatMessageContent(
        AuthorRole.Assistant,
        "AI response",
        metadata: new Dictionary<string, object?>
        {
            {
                "Usage",
                new Dictionary<string, int>
                {
                    { "InputTokenCount", 18 },
                    { "OutputTokenCount", 242 },
                    { "TotalTokenCount", 260 }
                }
            }
        });

    mockChatCompletion
        .Setup(x => x.GetChatMessageContentsAsync(
            It.IsAny<ChatHistory>(),
            It.IsAny<PromptExecutionSettings>(),
            It.IsAny<Kernel>(),
            It.IsAny<CancellationToken>()))
        .ReturnsAsync([mockResponse]);

    var kernelBuilder = Kernel.CreateBuilder();
    kernelBuilder.Services.AddSingleton(mockChatCompletion.Object);
    var kernel = kernelBuilder.Build();

    var service = new MyService(kernel);

    // Act
    var result = await service.DoWorkAsync("Prompt to AI");

    // Assert
    Assert.Equal("AI response", result.ToString());

    // Assert unconditionally so the test fails (rather than silently passing)
    // if the metadata is missing or has an unexpected shape.
    Assert.NotNull(result.Metadata);
    Assert.True(result.Metadata.TryGetValue("Usage", out var metadataObj));
    var metadata = Assert.IsType<Dictionary<string, int>>(metadataObj);
    Assert.Equal(18, metadata["InputTokenCount"]);
    Assert.Equal(242, metadata["OutputTokenCount"]);
    Assert.Equal(260, metadata["TotalTokenCount"]);
}
```

Please let us know if that works for you, thanks!
@romainsemur Got it, thanks for the additional details. In this case, since the `OpenAI.Chat.ChatTokenUsage` model doesn't have a public constructor, you can use the `OpenAIChatModelFactory.ChatTokenUsage()` method to initialize it on your side for testing purposes. Here is a full code example including the streaming scenario:
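(The full example referenced above is not captured in this excerpt. The following is a minimal sketch of the streaming part, assuming the OpenAI .NET SDK's `OpenAIChatModelFactory.ChatTokenUsage()` factory and the `System.Linq.Async` package for `ToAsyncEnumerable()`; `"AI response"` and the token counts are placeholder values.)

```csharp
// Build a real ChatTokenUsage via the test factory, since the model
// has no public constructor.
ChatTokenUsage usage = OpenAIChatModelFactory.ChatTokenUsage(
    outputTokenCount: 242,
    inputTokenCount: 18,
    totalTokenCount: 260);

// Attach it as metadata on a streaming chunk.
var streamingChunk = new StreamingChatMessageContent(
    AuthorRole.Assistant,
    "AI response",
    metadata: new Dictionary<string, object?> { { "Usage", usage } });

// Mock the streaming method, which returns IAsyncEnumerable<StreamingChatMessageContent>.
var mockChatCompletion = new Mock<IChatCompletionService>();
mockChatCompletion
    .Setup(x => x.GetStreamingChatMessageContentsAsync(
        It.IsAny<ChatHistory>(),
        It.IsAny<PromptExecutionSettings>(),
        It.IsAny<Kernel>(),
        It.IsAny<CancellationToken>()))
    .Returns(new[] { streamingChunk }.ToAsyncEnumerable());
```

The consuming code can then read `chunk.Metadata?["Usage"]` back as a `ChatTokenUsage` instance in its assertions.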