
--grammar usage #2364

Closed
ghost opened this issue Jul 24, 2023 · 5 comments

Comments

@ghost

ghost commented Jul 24, 2023

Hello,

I want to use the --grammar, --grammar-file parameters in llama.

I see examples, but they don't work as expected for a prompt with an Assistant, for example:

./main -m ~/llama2_7b_chat_uncensored.ggmlv3.q4_0.bin --color -c 2048 --keep -1 -n -1 -t 3 -b 7 -i -r "### HUMAN:" --in-prefix " " --in-suffix "### RESPONSE:" --grammar-file grammars/list.gbnf -p "### HUMAN: Hello, how are you?"

Will someone please help me figure out how to correctly use the grammar parameters? Essentially, I want the Assistant to communicate as usual, except use asterisks to display actions, for example: "### RESPONSE: *shakes your hand*"

Is it possible?

@ejones
Collaborator

ejones commented Jul 24, 2023

I'm having pretty good success with this very basic grammar (which would probably need fleshing out):

./main -m $LLAMA2_13B_Q4_0 --color -c 2048 --keep -1 -n -1 -t 3 -b 7 -i -r "### HUMAN:" --in-prefix " " --in-suffix "### RESPONSE:" \
--grammar 'root ::= " *" [a-z]+ (" " [a-z]+)* "* " [^\r\n]+ "\n### HUMAN:"' \
-p "### HUMAN: Hello, how are you?
### RESPONSE:" 

Here's an example of what I'm getting with llama 2 (base model; my understanding is the chat model requires specific formatting):

 ### HUMAN: Hello, how are you?
### RESPONSE: *waves* Good morning! I hope you will enjoy our game today.
### HUMAN: what is it
### RESPONSE: *points to poster with a picture of a piggybank* That's a piggybank! And it has gold in it! We play a game called "Treasure Hunt". You can win the treasure if you guess a word. Or lose, depending on how good are your guesses. Let me show you!
### HUMAN: 
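To make the one-liner easier to tweak, here is the same grammar split into named rules (a sketch; the rule names are just illustrative, but the shapes match the one-liner above):

```gbnf
# same grammar as the inline one-liner, split into named rules
root        ::= action description "\n### HUMAN:"
# an action wrapped in asterisks, e.g. " *waves* "
action      ::= " *" word (" " word)* "* "
word        ::= [a-z]+
# the rest of the response line, up to the newline
description ::= [^\r\n]+
```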

Will elaborate more a bit later.

@ghost
Author

ghost commented Jul 24, 2023

> I'm having pretty good success with this very basic grammar (which would probably need fleshing out):
>
> ./main -m $LLAMA2_13B_Q4_0 --color -c 2048 --keep -1 -n -1 -t 3 -b 7 -i -r "### HUMAN:" --in-prefix " " --in-suffix "### RESPONSE:" \
> --grammar 'root ::= " *" [a-z]+ (" " [a-z]+)* "* " [^\r\n]+ "\n### HUMAN:"' \
> -p "### HUMAN: Hello, how are you?
> ### RESPONSE:"
>
> Will elaborate more a bit later.

Ah, this is great, thank you! This helps me understand how I can mess around with it. 👍

Your example is excellent, but any suggestion is welcome.

@ejones
Collaborator

ejones commented Jul 25, 2023

Glad to help! Yeah, I was going to say: in my example I include the reverse prompt in the grammar, which is necessary to allow the model to generate it.

For interest's sake, we can remove some of the prompt options and delegate more of that to the grammar. This form seems to work similarly, albeit with a blank space before the human input.

 ./main -m $LLAMA2_13B_Q4_0 --color -c 2048 --keep -1 -i -r '### HUMAN: ' \
--grammar 'root ::= "### RESPONSE: *" [a-z]+ (" " [a-z]+)* "* " [^\r\n]+ "\n"' \
-p "### HUMAN: Hello, how are you?
"

In this case, the grammar generates the input suffix, and triggers EOS instead of emitting the reverse prompt. Interactive mode then inserts a blank line and the reverse prompt on EOS. (In theory, it should be possible to move even the reverse prompt into the grammar, but that doesn't seem to play nicely with interactive mode.)
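For what it's worth, the same grammar could also live in a file and be passed via --grammar-file instead of the inline --grammar string (a sketch; the filename is just an example):

```gbnf
# response.gbnf — the grammar ends at the newline, so the model
# hits EOS there instead of emitting the reverse prompt itself
root ::= "### RESPONSE: *" [a-z]+ (" " [a-z]+)* "* " [^\r\n]+ "\n"
```

It would be invoked as --grammar-file response.gbnf with the rest of the command unchanged.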

@ghost ghost closed this as completed Jul 25, 2023
@ejones
Collaborator

ejones commented Jul 26, 2023

FYI as of #2304 you can actually use the grammar in place of -r

@ghost
Author

ghost commented Jul 26, 2023

> FYI as of #2304 you can actually use the grammar in place of -r

Thank you for the heads up, it wouldn't have occurred to me. I'll try it.
