fix(ik-llama-cpp): patch clip.cpp for new ggml_quantize_chunk signature#9531

Merged
mudler merged 1 commit into master from fix/ik-llama-cpp-clip-quantize-chunk on Apr 24, 2026
Conversation

@mudler (Owner) commented Apr 24, 2026

Bumps the ik_llama.cpp pin to 16996aeab7. Upstream 286ce32...16996ae adds a trailing `const struct quantize_user_data *` parameter to `ggml_quantize_chunk` (PR ikawrakow/ik_llama.cpp#1677) but leaves `examples/llava/clip.cpp` unchanged, because their build has moved to `examples/mtmd/`. LocalAI's prepare.sh still copies from `examples/llava/`, so the dead 7-argument call reaches the grpc-server compile and fails. This PR patches the call site to pass `nullptr` for the new parameter.

Assisted-by: Claude:Opus-4.7 [Read] [Edit] [Bash]
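The fix itself is a one-line change at the `clip.cpp` call site. A sketch of the patch is below; the variable names (`new_type`, `f32_data`, `new_data`, `nrows`, `n_per_row`, `imatrix`) are illustrative and the exact arguments in `examples/llava/clip.cpp` may differ:

```diff
--- a/examples/llava/clip.cpp
+++ b/examples/llava/clip.cpp
-        new_size = ggml_quantize_chunk(new_type, f32_data, new_data, 0, nrows, n_per_row, imatrix);
+        // ggml_quantize_chunk now takes a trailing const struct quantize_user_data *;
+        // clip.cpp has no user data to pass, so nullptr preserves the old behavior.
+        new_size = ggml_quantize_chunk(new_type, f32_data, new_data, 0, nrows, n_per_row, imatrix, nullptr);
```

Passing `nullptr` keeps the quantization path behaviorally identical to the pre-1677 code while satisfying the new 8-argument signature.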

@mudler mudler merged commit c0920f3 into master Apr 24, 2026
42 checks passed
@mudler mudler deleted the fix/ik-llama-cpp-clip-quantize-chunk branch April 24, 2026 11:07
@localai-bot localai-bot added the bug Something isn't working label May 9, 2026

Labels: bug (Something isn't working)

2 participants