
Conversation

bitRAKE (Contributor) commented Mar 17, 2023

https://github.com/ggerganov/llama.cpp/blob/721311070e31464ac12bef9a4444093eb3eaebf7/main.cpp#L980-L983
This can fail to colorize the last params.n_batch tokens of the prompt correctly, because embd has just been loaded with those tokens and they have not been printed yet.
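
For context, here is a minimal, self-contained sketch of the ordering problem being described. It is illustrative only; names like `embd_inp`, `input_consumed`, and `n_batch` mirror the spirit of the linked code, not the exact lines in main.cpp:

```cpp
// Hypothetical illustration of the reported ordering issue (not the actual main.cpp code).
#include <cstdio>
#include <string>
#include <vector>

#define ANSI_COLOR_YELLOW "\x1b[33m"
#define ANSI_COLOR_RESET  "\x1b[0m"

int main() {
    const std::vector<std::string> embd_inp = {"The", " quick", " brown", " fox", " jumps"};
    const int n_batch = 2;                 // stands in for params.n_batch
    size_t input_consumed = 0;

    printf(ANSI_COLOR_YELLOW);             // the prompt should be printed in this color

    while (input_consumed < embd_inp.size()) {
        std::vector<std::string> embd;     // tokens queued for this batch

        // consume up to n_batch prompt tokens into embd
        while (input_consumed < embd_inp.size() && (int) embd.size() < n_batch) {
            embd.push_back(embd_inp[input_consumed]);
            ++input_consumed;
        }

        // BUG: the prompt is now fully *consumed*, but the last batch sitting in
        // embd has not been *printed* yet -- resetting the color here means that
        // final batch of prompt tokens is printed in the default color.
        if (input_consumed >= embd_inp.size()) {
            printf(ANSI_COLOR_RESET);
        }

        // tokens in embd are only printed after the color was already reset
        for (const auto & tok : embd) {
            printf("%s", tok.c_str());
        }
    }
    printf("\n");
    return 0;
}
```

Running this prints the last two prompt tokens uncolored, which matches the symptom above: the color reset keys off the prompt being consumed into embd rather than off it being printed.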

gjmulder added the bug (Something isn't working) label on Mar 17, 2023
sw (Contributor) commented Mar 19, 2023

I can confirm the bug; the PR fixes it and is certainly cleaner. Can you resolve the conflicts and maybe also take along the one-liner in #283, which is likewise related to colors? Thanks.

ggerganov (Member) commented

@sw merge if you approve it

sw merged commit 5c19c70 into ggml-org:master on Mar 19, 2023
dmahurin pushed commits referencing this pull request to dmahurin/llama.cpp on May 31 and Jun 1, 2023
Deadsg pushed a commit referencing this pull request to Deadsg/llama.cpp on Dec 19, 2023