Fix duplicating latest prompt #5546
Conversation
See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/executorch/5546
✅ No failures as of commit 793aaeb with merge base 0ec003b. (This comment was automatically generated by Dr. CI and updates every 15 minutes.)
This pull request was exported from Phabricator. Differential Revision: D62761977
Summary: Pull Request resolved: #5546

The last prompt sent was already included in `getConversationHistory()` and was then appended again before being sent with `generate()`, so it appeared twice. It looks like this got moved during a rebase. To fix this, we now call `getConversationHistory()` before adding the rawPrompt to a Message.

As for model responses, this did not noticeably change their quality (tested with Llama 3.1).

Reviewed By: Riandy

Differential Revision: D62761977
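The ordering issue above can be sketched in a few lines. This is a minimal, self-contained illustration, not the demo app's actual code: the `messages` list and the `buildPrompt*` helpers are hypothetical stand-ins for the app's `Message` handling, kept only to show why the call order matters.

```java
import java.util.ArrayList;
import java.util.List;

public class PromptHistoryFix {
    // Hypothetical stand-in for the app's stored Message list.
    static List<String> messages = new ArrayList<>();

    // Joins all stored messages into the prompt context.
    static String getConversationHistory() {
        return String.join("\n", messages);
    }

    // Buggy order: the raw prompt is added to the message list first,
    // so getConversationHistory() already contains it and the string
    // passed to generate() includes the prompt twice.
    static String buildPromptBuggy(String rawPrompt) {
        messages.add(rawPrompt);
        return getConversationHistory() + "\n" + rawPrompt;
    }

    // Fixed order: snapshot the history first, then record the new
    // message, so the latest prompt appears exactly once.
    static String buildPromptFixed(String rawPrompt) {
        String history = getConversationHistory();
        messages.add(rawPrompt);
        return history.isEmpty() ? rawPrompt : history + "\n" + rawPrompt;
    }
}
```

With the buggy order, sending "hello" on an empty history yields "hello\nhello"; with the fixed order it yields just "hello", while later prompts still see the full prior history.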
Force-pushed from 9faf2be to 793aaeb
This pull request has been merged in 3b63839.
@pytorchbot cherry-pick --onto release/0.4 -c fixnewfeature
Summary: Pull Request resolved: #5546

The last prompt sent was already included in `getConversationHistory()` and was then appended again before being sent with `generate()`, so it appeared twice. It looks like this got moved during a rebase. To fix this, we now call `getConversationHistory()` before adding the rawPrompt to a Message.

As for model responses, this did not noticeably change their quality (tested with Llama 3.1).

Reviewed By: Riandy

Differential Revision: D62761977

fbshipit-source-id: 2f975983965fe837147f1ffb8b5dcfa8f2061895

(cherry picked from commit 3b63839)
Cherry picking #5546: the cherry-pick PR is at #5568. It is recommended to link a fixnewfeature cherry-pick PR with an issue.
Fix duplicating latest prompt (#5546) (cherry picked from commit 3b63839) Co-authored-by: Chirag Modi <cmodi@meta.com>