The conversation replied with garbled code #1269
Comments
I also faced a similar issue!
I faced this issue too. Does anyone have a solution?
You need to merge the LLaMA weights with the Vicuna delta weights to use the Vicuna model. I hope that helps.
Using only LLaMA's weights did not produce garbled output, but using vicuna-7b-1.1 results in garbled text.
This is related to the model. Did you fine-tune the model yourself? Was it instruction-tuned?
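The weight-merging step suggested above can be done with FastChat's delta-weight tool. A sketch, with all local paths as placeholders (this assumes FastChat's `fastchat.model.apply_delta` module and the `lmsys/vicuna-7b-delta-v1.1` weights on Hugging Face):

```shell
# Merge the Vicuna v1.1 delta weights into the base LLaMA weights.
# The /path/to/... entries are placeholders; point them at your checkpoints.
python3 -m fastchat.model.apply_delta \
    --base-model-path /path/to/llama-7b \
    --target-model-path /path/to/vicuna-7b \
    --delta-path lmsys/vicuna-7b-delta-v1.1

# Then launch the CLI against the merged model:
python3 -m fastchat.serve.cli --model-path /path/to/vicuna-7b
```

Running the CLI against the raw delta weights (without this merge) is a common cause of garbled replies, since the delta checkpoint is not a usable model on its own.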
I launched the chat bot with the command `python3 -m fastchat.serve.cli --model-path`. What is the reason that all of its replies are garbled?