
Not sure if it's a bug or a configuration error... #1555

Open
0wwafa opened this issue Jun 26, 2024 · 0 comments
0wwafa commented Jun 26, 2024

With the same model (Mistral Instruct v0.3) and the same prompt:

If I use llama.cpp (any version from the last 3 weeks) and say "hello, how are you?", it answers "I am fine, and you?" (or similar).

If I do the same through Gradio and llama_cpp_python, I get:
"I am an assistant.....blah blah.. I don't have feelings... blah blah"

I even copied all the parameters from llama.cpp into llama_cpp_python and the Gradio interface.
I really don't understand what the difference might be.
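A common cause of this kind of divergence, beyond sampling parameters, is the chat template: llama.cpp's CLI and llama_cpp_python's chat-completion path may wrap the user message differently (system prompt, instruction tags) before the model ever sees it. A minimal sketch, assuming Mistral Instruct's documented `[INST]` format, of building the prompt string by hand so both stacks can be fed byte-identical input (the helper name is hypothetical):

```python
def mistral_instruct_prompt(user_message: str) -> str:
    """Wrap a single user turn in the Mistral Instruct template.

    Per Mistral's documentation, a one-turn prompt looks like:
        <s>[INST] {user message} [/INST]
    No system prompt is inserted; if one stack silently adds a
    system message ("You are an assistant..."), outputs will differ.
    """
    return f"<s>[INST] {user_message} [/INST]"

prompt = mistral_instruct_prompt("hello, how are you?")
print(prompt)  # <s>[INST] hello, how are you? [/INST]
```

Passing this string through a plain completion call on both backends (rather than a chat-completion call that applies its own template) is one way to check whether the template, and not the sampling settings, explains the different answers.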

@0wwafa 0wwafa changed the title Nosure if it's a bug or a configuration error... Not sure if it's a bug or a configuration error... Jun 26, 2024