
Conversation

CISC (Collaborator) commented Nov 23, 2025

Allow quantizing LoRA adapters at conversion again, but default to F32 (which has been the effective behavior since #8980 inadvertently forced it).

Fixes #17447
Fixes #10671
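
As a rough illustration (not the actual convert_lora_to_gguf.py code), the change amounts to exposing an --outtype choice while keeping f32 as the default; the mapping to gguf.LlamaFileType values below is an assumption about which output types are supported:

```python
# Minimal sketch of an --outtype flag that defaults to F32 but allows
# quantized LoRA output; not the real convert_lora_to_gguf.py code.
import argparse

import gguf  # gguf-py package bundled with llama.cpp

# Assumed set of output types; the actual script may accept a different set.
FTYPE_MAP = {
    "f32": gguf.LlamaFileType.ALL_F32,
    "f16": gguf.LlamaFileType.MOSTLY_F16,
    "bf16": gguf.LlamaFileType.MOSTLY_BF16,
    "q8_0": gguf.LlamaFileType.MOSTLY_Q8_0,
}

parser = argparse.ArgumentParser(description="LoRA -> GGUF conversion sketch")
parser.add_argument(
    "--outtype",
    choices=sorted(FTYPE_MAP),
    default="f32",  # F32 remains the default, matching the previous behavior
    help="output tensor type for the converted LoRA adapter",
)
args = parser.parse_args()
print(f"converting with file type: {FTYPE_MAP[args.outtype].name}")
```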

CISC merged commit b61de2b into master on Nov 24, 2025. 10 checks passed.
CISC deleted the cisc/convert-lora-allow-quantize branch on Nov 24, 2025 at 14:50.
ggerganov (Member) commented

Did this change somehow cause some of the CPU CI to start failing:

[screenshot: failing CPU CI checks]

ggerganov (Member) commented

@CISC For some reason this exception is not being caught when running in the CI:

https://github.com/ggml-org/llama.cpp/actions/runs/19645856267/job/56260791630#step:5:6742

Locally it is being caught and the execution proceeds normally. Any ideas?

CISC (Collaborator, Author) commented Nov 25, 2025

> Did this change somehow cause some of the CPU CI to start failing:

I don't see how; it must be something else.

> @CISC For some reason this exception is not being caught when running in the CI:
>
> https://github.com/ggml-org/llama.cpp/actions/runs/19645856267/job/56260791630#step:5:6742
>
> Locally it is being caught and the execution proceeds normally. Any ideas?

Strange, I'll have a look...

ggerganov (Member) commented

I traced it down: transformers==4.57.2 breaks the conversion and throws the second exception in the log. Downgrading to transformers==4.57.1 resolves the problem.
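
For anyone hitting this locally, a quick pre-flight guard could look like the sketch below (it assumes only the exact 4.57.2 release is affected):

```python
# Sketch: fail early if the known-broken transformers release is installed.
# Assumption: only the 4.57.2 release is affected.
import transformers

if transformers.__version__ == "4.57.2":
    raise SystemExit(
        "transformers 4.57.2 breaks LoRA conversion; "
        "downgrade with: pip install 'transformers==4.57.1'"
    )
```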

CISC (Collaborator, Author) commented Nov 25, 2025

> I traced it down: transformers==4.57.2 breaks the conversion and throws the second exception in the log. Downgrading to transformers==4.57.1 resolves the problem.

Yeah, I just found that out too: huggingface/transformers#42378

CISC (Collaborator, Author) commented Nov 25, 2025

There will be a hotfix as soon as huggingface/transformers#42389 merges.


Labels

python (python script changes)

Development

Successfully merging this pull request may close these issues.

Misc. bug: convert_lora_to_gguf ignores outtype
Misc. bug: convert_lora_to_gguf ignores outtype

4 participants