Mock Metadata Usage in a GetStreamingChatMessageContentsAsync response #10638

Answered by dmytrostruk
romainsemur asked this question in Q&A

@romainsemur Got it, thanks for the additional details. In this case, since the OpenAI.Chat.ChatTokenUsage model doesn't have a public constructor, you can use the OpenAIChatModelFactory.ChatTokenUsage() method to initialize it on your side for testing purposes. Here is a full code example, including the streaming scenario:

[Fact]
public async Task DoWorkWithPrompt()
{
    // OpenAI.Chat.ChatTokenUsage has no public constructor, so build it
    // through OpenAIChatModelFactory for testing purposes.
    var mockUsage = OpenAIChatModelFactory.ChatTokenUsage(
        outputTokenCount: 242,
        inputTokenCount: 18,
        totalTokenCount: 260);

    // Attach the usage to the chunk's metadata; the OpenAI connector
    // surfaces token usage under the "Usage" key.
    var mockResponse = new StreamingChatMessageContent(
        AuthorRole.Assistant,
        "AI response",
        metadata: new Dictionary<string, object?>
        {
            { "Usage", mockUsage }
        });

    // Read the usage back the same way consuming code would.
    var usage = (ChatTokenUsage)mockResponse.Metadata!["Usage"]!;
    Assert.Equal(260, usage.TotalTokenCount);
    await Task.CompletedTask; // the streaming consumption itself is sketched below
}
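To exercise the streaming path itself, the mocked chunk can be returned from a mocked IChatCompletionService and consumed through GetStreamingChatMessageContentsAsync. The sketch below assumes Moq as the mocking library and xUnit as the test framework; the test class name, the ToAsyncEnumerableAsync helper, and the prompt text are illustrative only, not part of the original example.

using System.Collections.Generic;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.ChatCompletion;
using Moq;
using OpenAI.Chat;
using Xunit;

public class StreamingUsageMetadataTests
{
    [Fact]
    public async Task StreamedChunkExposesUsageMetadata()
    {
        var mockUsage = OpenAIChatModelFactory.ChatTokenUsage(
            outputTokenCount: 242, inputTokenCount: 18, totalTokenCount: 260);

        var mockChunk = new StreamingChatMessageContent(
            AuthorRole.Assistant,
            "AI response",
            metadata: new Dictionary<string, object?> { { "Usage", mockUsage } });

        // Fake chat completion service that streams the single mocked chunk.
        var chatService = new Mock<IChatCompletionService>();
        chatService
            .Setup(s => s.GetStreamingChatMessageContentsAsync(
                It.IsAny<ChatHistory>(),
                It.IsAny<PromptExecutionSettings>(),
                It.IsAny<Kernel>(),
                It.IsAny<CancellationToken>()))
            .Returns(ToAsyncEnumerableAsync(mockChunk));

        var history = new ChatHistory();
        history.AddUserMessage("Hello");

        // Consume the stream the way production code would and pick up the usage.
        ChatTokenUsage? usage = null;
        await foreach (var chunk in chatService.Object.GetStreamingChatMessageContentsAsync(history))
        {
            if (chunk.Metadata is not null &&
                chunk.Metadata.TryGetValue("Usage", out var value))
            {
                usage = value as ChatTokenUsage;
            }
        }

        Assert.NotNull(usage);
        Assert.Equal(260, usage!.TotalTokenCount);
    }

    // Hypothetical helper: wraps the mocked chunks in an IAsyncEnumerable
    // so Moq can return them from the streaming API.
    private static async IAsyncEnumerable<StreamingChatMessageContent> ToAsyncEnumerableAsync(
        params StreamingChatMessageContent[] chunks)
    {
        foreach (var chunk in chunks)
        {
            yield return chunk;
        }

        await Task.CompletedTask;
    }
}

If the code under test goes through the Kernel instead (for example InvokePromptStreamingAsync), the same mocked service can be registered via Kernel.CreateBuilder().Services before building the kernel.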

Answer selected by sophialagerkranspandey