
The conversation replied with garbled code #1269

Open
A-runaaaa opened this issue May 16, 2023 · 5 comments

Comments

@A-runaaaa

[screenshot: garbled replies from the model]
I started the dialogue bot with the command python3 -m fastchat.serve.cli --model-path and all of the model's replies are garbled. What is the reason?
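For reference, the command above is normally run with an explicit model path; a minimal sketch, where the path is a hypothetical placeholder and not taken from this issue:

```shell
# Sketch of the reported invocation; /path/to/vicuna-7b-v1.1 is a
# hypothetical placeholder for a locally available (merged) checkpoint.
python3 -m fastchat.serve.cli --model-path /path/to/vicuna-7b-v1.1
```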

@0xTong

0xTong commented May 16, 2023

I also ran into a similar problem!

@Kaka23333

I ran into this problem too. Has anyone found a solution?

@0xTong

0xTong commented May 22, 2023

You need to merge the LLaMA base weights with the Vicuna delta weights before using the Vicuna model. I hope that helps.
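FastChat provides an apply_delta entry point for this merge; a hedged sketch, assuming the v1.1 delta published by lmsys (the local paths are placeholders):

```shell
# Merge the original LLaMA-7B base weights with the Vicuna-7B v1.1 delta.
# Local paths are placeholders; the delta repo name follows the FastChat README.
python3 -m fastchat.model.apply_delta \
    --base-model-path /path/to/llama-7b \
    --target-model-path /path/to/vicuna-7b-v1.1 \
    --delta-path lmsys/vicuna-7b-delta-v1.1
```

The merged checkpoint written to --target-model-path is then what you pass to fastchat.serve.cli as --model-path; serving the raw delta or the bare LLaMA weights instead can produce the kind of garbled replies reported here.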

@A-runaaaa
Author

Using only LLaMA's weights did not produce garbled output, but using vicuna-7b-v1.1 results in garbled replies.

@surak
Collaborator

surak commented Oct 23, 2023

This is related to the model. Did you fine-tune the model yourself? Did you instruction-tune it?

4 participants