
Fix context generation order in chat skill #653

Conversation

TaoChenOSU
Contributor

Motivation and Context

The chat function generates the final prompt by populating it with multiple contexts. Each context is constrained by its own token limit except the last one, the chat history, which fills the remainder of the prompt up to the overall token limit.

A bug was discovered where the chat history context was not the last context, so it could not fill the remaining token budget as intended.

Description

Move chat history to the end of the prompt.
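The fix can be illustrated with a minimal sketch of the assembly order described above. This is not the actual semantic-kernel implementation; all names (`Context`, `build_prompt`, the whitespace tokenizer) are hypothetical stand-ins chosen to show why the chat history context must come last:

```python
# Hypothetical sketch of the ordering fix: every context except chat
# history is truncated to its own per-context token limit, and chat
# history is appended LAST so it can absorb whatever budget remains.
# Names here are illustrative, not the semantic-kernel API.
from dataclasses import dataclass
from typing import List


def count_tokens(text: str) -> int:
    """Crude stand-in tokenizer: one token per whitespace-split word."""
    return len(text.split())


def truncate_to_tokens(text: str, limit: int) -> str:
    """Keep at most `limit` tokens from the end of the text."""
    if limit <= 0:
        return ""
    return " ".join(text.split()[-limit:])


@dataclass
class Context:
    name: str
    text: str
    token_limit: int  # per-context cap; not applied to chat history


def build_prompt(contexts: List[Context], chat_history: str, total_limit: int) -> str:
    """Assemble the prompt; chat history goes last and fills what is left."""
    parts = []
    used = 0
    for ctx in contexts:
        chunk = truncate_to_tokens(ctx.text, ctx.token_limit)
        parts.append(chunk)
        used += count_tokens(chunk)
    # Appending chat history last lets it consume the remaining budget.
    remaining = max(total_limit - used, 0)
    parts.append(truncate_to_tokens(chat_history, remaining))
    return "\n".join(parts)
```

If chat history were inserted earlier, a fixed-limit context placed after it would be pushed past the overall budget; placing it last makes the remaining-budget calculation well defined.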

Contribution Checklist

@TaoChenOSU TaoChenOSU added PR: ready for review All feedback addressed, ready for reviews samples labels Apr 25, 2023
@adrianwyatt adrianwyatt self-assigned this Apr 25, 2023
@adrianwyatt adrianwyatt added PR: ready to merge PR has been approved by all reviewers, and is ready to merge. and removed PR: ready for review All feedback addressed, ready for reviews labels Apr 25, 2023
@adrianwyatt adrianwyatt enabled auto-merge (squash) April 25, 2023 22:47
@adrianwyatt adrianwyatt merged commit d6fd518 into microsoft:main Apr 25, 2023
11 checks passed
dluc pushed a commit that referenced this pull request Apr 29, 2023
dehoward pushed a commit to lemillermicrosoft/semantic-kernel that referenced this pull request Jun 1, 2023
golden-aries pushed a commit to golden-aries/semantic-kernel that referenced this pull request Oct 10, 2023