Fix Baichuan2 prompt format #10334
Conversation
Also need to update the results in the README?
Please also update the prompt format in https://github.com/intel-analytics/BigDL/blob/main/python/llm/example/GPU/PyTorch-Models/Model/baichuan2/generate.py#L25
Have updated the README and generate.py.
# Though this is the official prompt format, we found it has problems generating English answers
# If you want to ask English questions, we recommend you to change the prompt format
For English prompts, you are recommended to change the prompt format.
* Fix Baichuan2 prompt format
* Fix Baichuan2 README
* Change baichuan2 prompt info
* Change baichuan2 prompt info
Description
Fix the Baichuan2 prompt format, referring to link1 and link2.
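For context, the prompt format being fixed can be sketched as below. This is a minimal sketch, assuming the `<reserved_106>`/`<reserved_107>` markers from Baichuan2's chat special tokens (user and assistant turn markers in the model's tokenizer/generation config); the helper name `build_prompt` is hypothetical and not from this PR.

```python
def build_prompt(user_question: str) -> str:
    """Sketch of a Baichuan2 chat prompt (assumed format, not from this PR).

    <reserved_106> marks the start of the user's turn;
    <reserved_107> marks where the assistant's answer should begin.
    """
    return f"<reserved_106>{user_question}<reserved_107>"

# Example: the resulting string would be passed to the tokenizer before generation.
prompt = build_prompt("What is AI?")
print(prompt)
```

As the review comments note, a chat-style template like this may behave differently for English questions, which is why the example script recommends changing the prompt format in that case.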