Merged
22 commits
dfeda32  infill tokens correction (vvhg1, Oct 6, 2023)
8bd24b2  Merge branch 'ggerganov:master' into master (vvhg1, Oct 6, 2023)
6796e74  serverinfill tokens correction (vvhg1, Oct 6, 2023)
377be2f  removing any leading whitespace from infill suffix and removing leead… (vvhg1, Oct 6, 2023)
b4046aa  removing any leading whitespace from infill suffix and removing leead… (vvhg1, Oct 6, 2023)
0526560  only rm when params.escape, rm space if possible which is added back … (vvhg1, Oct 7, 2023)
63ba0b6  only rm when params.escape, rm space if possible which is added back … (vvhg1, Oct 7, 2023)
003c15b  Revert "only rm when params.escape, rm space if possible which is add… (vvhg1, Oct 7, 2023)
fc01dc0  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 7, 2023)
c3a7f84  fix interactive prompt escaping and fix server infill leading space h… (vvhg1, Oct 7, 2023)
b1b6bef  rm unnecessary bool check (vvhg1, Oct 7, 2023)
4a21468  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 7, 2023)
d9dae93  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 8, 2023)
141329f  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 8, 2023)
3517729  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 9, 2023)
9b608da  Merge branch 'ggerganov:master' into master (vvhg1, Oct 10, 2023)
8eef958  Merge branch 'ggerganov:master' into master (vvhg1, Oct 14, 2023)
ee652b2  process escapes for neg prompt and interactive consec prompts (vvhg1, Oct 14, 2023)
52a7767  Merge branch 'ggerganov:master' into master (vvhg1, Oct 14, 2023)
02ac367  removed unneccessary static string escape (vvhg1, Oct 19, 2023)
f0d3971  Merge branch 'master' of github.com:vvhg1/llama.cpp (vvhg1, Oct 19, 2023)
97d67e8  Merge branch 'master' of github.com:ggerganov/llama.cpp (vvhg1, Oct 19, 2023)
1 change: 1 addition & 0 deletions common/common.cpp
@@ -631,6 +631,7 @@ bool gpt_params_parse(int argc, char ** argv, gpt_params & params) {
process_escapes(params.prompt);
process_escapes(params.input_prefix);
process_escapes(params.input_suffix);
process_escapes(sparams.cfg_negative_prompt);
Member:
Keep only this line from this PR and we can merge.
All other changes are not necessary

for (auto & antiprompt : params.antiprompt) {
process_escapes(antiprompt);
}
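For context, the process_escapes calls in this hunk turn user-typed escape sequences (such as a literal backslash followed by "n") into the control characters they name, so the negative prompt and antiprompts behave like the main prompt. A minimal, hypothetical sketch of such a helper is below; the upstream implementation in common/common.cpp may handle additional sequences (quotes, hex escapes, etc.):

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Hypothetical sketch of an escape-processing helper in the spirit of the
// process_escapes calls above. This version rewrites "\n", "\t" and "\\"
// in place and leaves unknown escapes untouched.
static void process_escapes_sketch(std::string & input) {
    std::size_t out = 0;
    for (std::size_t in = 0; in < input.size(); ++in) {
        if (input[in] == '\\' && in + 1 < input.size()) {
            switch (input[++in]) {
                case 'n':  input[out++] = '\n'; break;
                case 't':  input[out++] = '\t'; break;
                case '\\': input[out++] = '\\'; break;
                default:
                    // unknown escape: keep the backslash and the character
                    input[out++] = '\\';
                    input[out++] = input[in];
                    break;
            }
        } else {
            input[out++] = input[in];
        }
    }
    input.resize(out);
}
```

Applied to the four-character string `a\nb`, this yields the three characters `a`, newline, `b`.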
3 changes: 3 additions & 0 deletions examples/main/main.cpp
@@ -767,6 +767,9 @@ int main(int argc, char ** argv) {
n_consumed = embd_inp.size();
embd_inp.insert(embd_inp.end(), inp_pfx.begin(), inp_pfx.end());
}
if (params.escape) {
process_escapes(buffer);
}
Contributor Author:
Do we not also want to keep this to escape the new input?

Member:
I don't think we want to - the model will not generate unescaped stuff

Contributor Author:
Would that not result in a situation where we escape the initial prompt but not the following prompts in interactive mode?

Contributor:
@vvhg1
That's the current behavior: user input in interactive mode tokenizes "\n" as the two literal characters \ and n.

While enabling escapes for interactive user input would allow "multiline" input via "\n", it would prevent the user from writing "\n" literally, making it impossible to do things like copy-pasting code snippets.

Contributor Author:
@staviq I see your point. What I don't yet understand is: why would we want to treat the initial prompt differently from subsequent prompts?

Member:
Ah, wait - I think I was wrong. Let me check this again later, but we might want to keep this escape

Member:
Ok, I agree we should keep this escape as well

Contributor Author:
OK, should all be ready to merge.


const auto line_pfx = ::llama_tokenize(ctx, params.input_prefix, false, true);
const auto line_inp = ::llama_tokenize(ctx, buffer, false, false);
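The discussion above settled on keeping the conditional escape in the interactive loop: escape processing is applied to each new interactive buffer, but only when params.escape is set. A hypothetical sketch of that gating follows; params_sketch, unescape_newlines, and prepare_input are illustrative names, not the upstream API, and the stand-in helper only handles "\n":

```cpp
#include <cassert>
#include <cstddef>
#include <string>

// Illustrative stand-in for gpt_params: only carries the escape flag.
struct params_sketch { bool escape = false; };

// Minimal stand-in for process_escapes: only rewrites "\n".
static void unescape_newlines(std::string & s) {
    std::size_t pos;
    while ((pos = s.find("\\n")) != std::string::npos) {
        s.replace(pos, 2, "\n");
    }
}

// Mirrors the gating added in main.cpp: interactive input goes through
// escape processing only when the escape flag is set.
static std::string prepare_input(const params_sketch & p, std::string buffer) {
    if (p.escape) {
        unescape_newlines(buffer);
    }
    return buffer;
}
```

With the flag set, a typed `\n` becomes a real newline (multiline input); without it, the two characters pass through to tokenization untouched, matching the literal-backslash behavior discussed in the thread.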