.Net: Bug: RenderedPrompt is empty when used with InvokeStreamingAsync #12243

Open
@Daniellled

Description

Describe the bug
In an IFunctionInvocationFilter implementation, FunctionInvocationContext.Result.RenderedPrompt is either empty or populated depending on whether InvokeStreamingAsync or InvokeAsync is used.

To Reproduce
Steps to reproduce the behavior:

  1. Create a class and implement the IFunctionInvocationFilter interface
  2. Within OnFunctionInvocationAsync, print context.Result.RenderedPrompt before invoking the next delegate
  3. Within OnFunctionInvocationAsync, print context.Result.RenderedPrompt after invoking the next delegate
  4. Invoke the kernel function using both InvokeAsync and InvokeStreamingAsync (see the sample application under Additional context)

Expected behavior
It should not matter how the KernelFunction is invoked; RenderedPrompt should be populated in both cases.

Screenshots

Non-Streaming:

*************************** Non Streaming ***************************
################################# Function Filter #################################
Before Next Call - RenderedPrompt:
After Next Call - RenderedPrompt: Question: What is New York?; Answer:

Streaming:

**************************** Streaming ******************************
################################# Function Filter #################################
Before Next Call - RenderedPrompt:
After Next Call - RenderedPrompt:

Platform

  • Language: C#
  • Source: Microsoft.SemanticKernel 1.54.0
  • AI model: llama3.2
  • IDE: VS Code
  • OS: Windows

Additional context
Sample application:

using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;

// Create a kernel with an OpenAI-compatible chat completion endpoint (local llama3.2)
IKernelBuilder builder = Kernel.CreateBuilder();
builder.AddOpenAIChatCompletion(
    modelId: "llama3.2",
    apiKey: "YOUR_API_KEY",
    httpClient: new HttpClient
    {
        Timeout = TimeSpan.FromMinutes(2),
        BaseAddress = new Uri("http://localhost:7869/v1")
    });

// Add enterprise components
//builder.Services.AddLogging(services => services.AddConsole().SetMinimumLevel(LogLevel.Trace));

// Build the kernel
Kernel kernel = builder.Build();
kernel.FunctionInvocationFilters.Add(new FunctionFilter());

var questionAnswerFunction = kernel.CreateFunctionFromPrompt("Question: {{$input}}; Answer:");

// Non Streaming
Console.WriteLine("*************************** Non Streaming ***************************");
var result = await kernel.InvokeAsync(questionAnswerFunction, new() { ["input"] = "What is New York?" });
Console.WriteLine(result.GetValue<string>());

Console.WriteLine("**************************** Streaming ******************************");
await foreach (string text in kernel.InvokeStreamingAsync<string>(questionAnswerFunction, new() { ["input"] = "What is New York?" }))
{
    Console.Write(text);
}

public class FunctionFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        Console.WriteLine("################################# Function Filter #################################");

        // Empty in both paths: the function has not been invoked yet.
        Console.WriteLine($"Before Next Call - RenderedPrompt: {context.Result.RenderedPrompt}");

        await next(context);

        // Populated after InvokeAsync, but still empty after InvokeStreamingAsync.
        Console.WriteLine($"After Next Call - RenderedPrompt: {context.Result.RenderedPrompt}");
    }
}
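
Note: in the streaming path, next(context) appears to complete before the response stream is consumed; context.Result then wraps a lazy IAsyncEnumerable rather than a finished value, which may be why RenderedPrompt is never populated at either checkpoint. A minimal sketch of a filter that unwraps and re-wraps the stream (assuming the prompt function yields StreamingChatMessageContent chunks; StreamingAwareFilter and InspectAsync are illustrative names):

public class StreamingAwareFilter : IFunctionInvocationFilter
{
    public async Task OnFunctionInvocationAsync(FunctionInvocationContext context, Func<FunctionInvocationContext, Task> next)
    {
        await next(context);

        // Streaming path: the result holds a lazy stream that has not run yet.
        if (context.Result.GetValue<object>() is IAsyncEnumerable<StreamingChatMessageContent> stream)
        {
            context.Result = new FunctionResult(context.Result, InspectAsync(stream));
        }
    }

    // Illustrative helper: passes chunks through so they can be observed
    // while the caller consumes the stream.
    private static async IAsyncEnumerable<StreamingChatMessageContent> InspectAsync(IAsyncEnumerable<StreamingChatMessageContent> stream)
    {
        await foreach (StreamingChatMessageContent chunk in stream)
        {
            yield return chunk;
        }
    }
}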
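
A possible workaround for logging the prompt in both paths is an IPromptRenderFilter, which observes the prompt at render time instead of on the function result (a minimal sketch against Microsoft.SemanticKernel 1.54.0; PromptFilter is an illustrative name):

public class PromptFilter : IPromptRenderFilter
{
    public async Task OnPromptRenderAsync(PromptRenderContext context, Func<PromptRenderContext, Task> next)
    {
        await next(context);

        // After rendering completes, the prompt text is available on the context.
        Console.WriteLine($"Prompt Render Filter - RenderedPrompt: {context.RenderedPrompt}");
    }
}

Register it alongside the function filter with kernel.PromptRenderFilters.Add(new PromptFilter());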

Labels

.NET (Issue or Pull requests regarding .NET code), bug (Something isn't working)

Type

Bug