Issues: LostRuins/koboldcpp (forked from ggml-org/llama.cpp)
#1486  New external file explorer feature in 1.88 causes the UI to freeze temporarily. (opened Apr 16, 2025 by sais-github)
#1483  mmproj files cause ggml_cuda_cpy: unsupported type combination (bf16 to bf16). (opened Apr 15, 2025 by TFWol)
#1482  --cli mode causes koboldcpp to close instantly or error once input is sent. (opened Apr 14, 2025 by wildwolf256)
#1481  GGML_ASSERT(cgraph->n_nodes < cgraph->size) failed in new version. (opened Apr 14, 2025 by DerRehberg)
#1473  Gemma 3 + mmproj + flashattention falls back to CPU decoding when using --quantkv. (opened Apr 8, 2025 by vlawhern)
#1469  CUDA Error: an illegal memory access was encountered. [label: bug] (opened Apr 7, 2025 by jodleif)
#1460  1.87: Build fails: error: use of undeclared identifier 'matmul_f32_f32_coopmat_len'. (opened Apr 2, 2025 by yurivict)
#1454  [Feature Request] Add an option to load .mmproj files as CPU-only. (opened Mar 29, 2025 by Teramanbr)
#1451  Error loading Qwen2.5-VL-32B-Instruct vision model: OSError [WinError -529697949]. (opened Mar 29, 2025 by syzu111)
#1450  [Enhancement] Separate default paths for Models and Saved Configs. (opened Mar 29, 2025 by AllesMeins)
#1446  Access violation, but works after retrying, sometimes with garbled output. (opened Mar 27, 2025 by RationalFragile)
#1443  [Feature Request] Allow use of the Corpo theme in other modes, for example Story mode. (opened Mar 25, 2025 by mhussaincov93)