@44670 (Contributor) commented May 4, 2023

Hi!

I would like to submit a pull request adding an "--in-suffix" option. This option lets users specify a suffix string that is appended after their input in interactive mode when running LLM models on the CPU.

As the team working on OpenBuddy, a multilingual open model able to understand users' questions and generate creative content, we find the llama.cpp project incredibly useful for running LLM models on personal hardware. We appreciate your hard work and dedication in creating it.

Attached is a screenshot showing a successful test on our own model. We use the prompt format "User: [question]\nAssistant:", which requires appending "Assistant:" after the user's input in interactive mode so that the model recognizes the role switch and outputs the answer correctly.
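The prompt format above can be sketched as follows. This is a minimal Python illustration of the assumed semantics, not the actual llama.cpp C++ code; the function name and whitespace handling are simplified for clarity.

```python
# Sketch (assumed semantics, not the real llama.cpp implementation) of how
# --in-prefix and --in-suffix wrap each user turn in interactive mode.

def wrap_user_input(user_text: str,
                    in_prefix: str = " ",
                    in_suffix: str = "Assistant:") -> str:
    """Build the text injected into the model for one user turn."""
    # The model has just emitted the reverse prompt "User:". The prefix goes
    # before what the user types and the suffix after it, so the model sees
    # 'User: <question>\nAssistant:' and continues in the assistant role.
    return f"{in_prefix}{user_text}\n{in_suffix}"

print(repr(wrap_user_input("What is llama.cpp?")))
```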

Thank you for considering this pull request. We look forward to contributing to the llama.cpp project.

[Screenshot: interactive session demonstrating the prompt format on the OpenBuddy model]

@44670 (author) commented May 4, 2023

By the way, here is the command-line that we have been testing:

```sh
bin/main -m 7b-q4_0.bin --interactive-first --reverse-prompt "User:" \
    --in-prefix " " --in-suffix "Assistant:" -f prompt.txt
```
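For context, a prompt.txt matching this format might look like the following. This is a hypothetical example; the actual file used in testing was not shared in the thread.

```text
You are Assistant, a helpful multilingual AI created by the OpenBuddy team.

User: Hello!
Assistant: Hi! How can I help you today?
```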

Please let me know if you need further explanation.

@44670 (author) commented May 4, 2023

Hi, I have added a new commit that prints the input suffix on screen before generation starts.

This does not affect the generated output, but it improves the user experience by making the session feel more like a conversation.
[Screenshot: interactive session with the input suffix echoed before the model's reply]
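The UX tweak described above can be sketched like this. It is an assumed illustration in Python; the real change lives in llama.cpp's C++ interactive loop.

```python
import sys

# Sketch (assumed behavior, not the actual C++ patch): echo the input
# suffix to stdout right after the user's line, before token generation,
# so the on-screen transcript reads "User: <question>" followed by
# "Assistant: <answer>" instead of leaving the suffix invisible.

def echo_suffix(in_suffix: str = "Assistant:") -> None:
    sys.stdout.write(in_suffix)
    sys.stdout.flush()
```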

@ggerganov (Member) left a comment:


Feel free to add a link to OpenBuddy to the README!

@ggerganov merged commit 2edbdb0 into ggml-org:master on May 4, 2023
@44670 (author) commented May 4, 2023

Thanks for merging! We would be more than happy to add our model to the README file!

We will make another pull request soon.
