
Support for Multi Model Detailed Response (Max Tokens, Prompt Tokens, etc) #1181

Merged

Conversation

RogerBarreto
Member

Motivation and Context

This addresses a common feature request and a missing capability: acquiring the actual response details from the models we are integrating.

Resolves #802
Resolves #618
Partially resolves #351
Provides an alternative way of getting the results to #693

Description

This change introduces a new property, `object? SKContext.LastPromptResults`, which contains a list of the result details for each prompt.

The Connector namespace also provides extension methods that convert this object into the concrete class you can use to serialize the model result or inspect its details.

Normal suggested usage:

```
var textResult = await excuseFunction.InvokeAsync("I missed the F1 final race");
```

Getting the model result as JSON:

```
var modelResultJson = JsonSerializer.Serialize(textResult.LastPromptResults);
```

Getting the model result as a traversable object (using a connector extension):

```
var totalTokens = textResult.GetOpenAILastPromptResult()?.Usage.TotalTokens;
```
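To illustrate the serialization step in isolation, here is a minimal, self-contained sketch using `System.Text.Json`. The `fakeResult` payload below is a hypothetical stand-in; the real shape of `LastPromptResults` is defined by the connector:

```csharp
using System;
using System.Text.Json;

// Hypothetical stand-in for a model result payload; the actual
// LastPromptResults shape comes from the connector (e.g. OpenAI usage data).
var fakeResult = new { Usage = new { PromptTokens = 12, CompletionTokens = 48, TotalTokens = 60 } };

// Pretty-print the result, as one might do with textResult.LastPromptResults.
var json = JsonSerializer.Serialize(fakeResult, new JsonSerializerOptions { WriteIndented = true });
Console.WriteLine(json);
```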

Contribution Checklist

@RogerBarreto RogerBarreto changed the title Support for Multi Model Detailed Response (Max Tokens, Prompt Tokens, etc) Support for Multi Model/Modal Detailed Response (Max Tokens, Prompt Tokens, etc) May 23, 2023
@RogerBarreto RogerBarreto changed the title Support for Multi Model/Modal Detailed Response (Max Tokens, Prompt Tokens, etc) Support for Multi Model Detailed Response (Max Tokens, Prompt Tokens, etc) May 23, 2023
@github-actions github-actions bot added labels: .NET (Issue or Pull requests regarding .NET code), kernel (Issues or pull requests impacting the core kernel), kernel.core — May 23, 2023
@RogerBarreto RogerBarreto requested review from dluc and shawncal and removed request for dluc May 23, 2023 18:25
@RogerBarreto RogerBarreto self-assigned this May 23, 2023
@shawncal
Member

```
var modelResultJson = JsonSerializer.Serialize(textResult.LastPromptResults);
```

Can we just offer this as something like `textResult.LastPromptResults.AsJson()`? Or is there a good reason to have users bring their own serializer?
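For reference, such an `AsJson()` helper could be a thin extension over `System.Text.Json` — a hypothetical sketch of the suggestion above, not part of this PR:

```csharp
using System.Text.Json;

// Hypothetical convenience extension; not part of the PR. It simply wraps
// JsonSerializer so callers don't have to bring their own serializer.
public static class ObjectJsonExtensions
{
    public static string AsJson(this object? value, bool indented = false) =>
        JsonSerializer.Serialize(value, new JsonSerializerOptions { WriteIndented = indented });
}
```

With that in place, `textResult.LastPromptResults.AsJson()` would serialize the results without the caller touching `JsonSerializer` directly.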

@shawncal
Member

Good to see this! I just took a quick glance, but I'll spend some time trying it out tomorrow.

…ption captured in the ClientBase is passed as inner exception to the AIException
@RogerBarreto RogerBarreto added labels: enhancement; PR: ready to merge (PR has been approved by all reviewers, and is ready to merge) — May 30, 2023
@lemillermicrosoft lemillermicrosoft enabled auto-merge (squash) June 7, 2023 16:49
@lemillermicrosoft lemillermicrosoft merged commit bc5d901 into microsoft:main Jun 7, 2023
shawncal pushed a commit to shawncal/semantic-kernel that referenced this pull request Jul 6, 2023
… etc) (microsoft#1181)
@RogerBarreto RogerBarreto deleted the features/get-model-response branch December 5, 2023 13:44
Labels

- kernel — Issues or pull requests impacting the core kernel
- .NET — Issue or Pull requests regarding .NET code
- PR: ready to merge — PR has been approved by all reviewers, and is ready to merge.
Development

Successfully merging this pull request may close these issues:

- Get the tokens in the response
- Token usage stats from responses should be returned to the client.
4 participants