Issues: ggerganov/llama.cpp

Issues list

Recoverable Error Handling (enhancement)
#4385 opened Dec 9, 2023 by martindevans · 4 tasks done

Mixtral MOE (enhancement)
#4381 opened Dec 8, 2023 by fakerybakery

llava: batch inference
#4378 opened Dec 8, 2023 by Borobo

High inference time
#4377 opened Dec 8, 2023 by adeelhasan19

How to use batching
#4372 opened Dec 8, 2023 by EricLBuehler

Implementing TokenMonster tokenizer (enhancement)
#4363 opened Dec 7, 2023 by Sovenok-Hacker

Compilation error on Mac M2 (bug-unconfirmed)
#4361 opened Dec 7, 2023 by leocus · 4 tasks done

Possible of implementing mamba ssm (enhancement, help wanted)
#4353 opened Dec 7, 2023 by tikikun · 4 tasks done

Compilation Error (bug-unconfirmed)
#4346 opened Dec 6, 2023 by mitesh741

Separation of declarations in llama.cpp (enhancement)
#4339 opened Dec 5, 2023 by jmikedupont2

llm_build_context::build_<X>() functions refactor (enhancement)
#4338 opened Dec 5, 2023 by jmikedupont2

[Feature request] Support for "XVERSE" model (enhancement)
#4337 opened Dec 5, 2023 by aspwow

Add GPU support for training (enhancement)
#4336 opened Dec 5, 2023 by gocursor