
[WIP] FIX Make Mixtral LoRA loading work #44478

Merged
BenjaminBossan merged 11 commits into huggingface:main from BenjaminBossan:peft-weight-conversion-fixes
Mar 11, 2026

Conversation

BenjaminBossan (Member) commented Mar 5, 2026

Required fixes:

  • Some code used unordered data structures, making the weight order nondeterministic.
  • Adjust alpha to offset the increased rank resulting from fusion.
  • Import functions from PEFT when it is available.
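A minimal sketch of the ordering issue (hypothetical parameter names, not the actual transformers code): iterating over an unordered container of expert weight keys can place fused weights in a different order on every run, while sorting the keys first makes the fusion deterministic.

```python
# Hypothetical expert LoRA weight keys, inserted out of order.
state_dict = {
    "experts.2.w1.lora_A": "A2",
    "experts.0.w1.lora_A": "A0",
    "experts.1.w1.lora_A": "A1",
}

# Bad: set iteration order is unspecified, so stacking weights in this
# order can differ between runs and interpreters.
unordered_keys = set(state_dict)

# Good: sort the keys before stacking/concatenating expert weights.
ordered_keys = sorted(state_dict)
assert ordered_keys == [
    "experts.0.w1.lora_A",
    "experts.1.w1.lora_A",
    "experts.2.w1.lora_A",
]
```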

See huggingface/peft#3083.
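The alpha adjustment can be illustrated with a short sketch (the helper name is hypothetical, not the code from this PR): LoRA scales its update by `alpha / r`, so if per-expert adapters of rank `r` are fused into a single adapter of rank `n_experts * r`, alpha must grow by the same factor to keep the effective scaling unchanged.

```python
def adjusted_alpha(alpha: float, rank: int, fused_rank: int) -> float:
    """Scale alpha so the LoRA scaling factor (alpha / rank) survives fusion.

    Hypothetical helper for illustration: delta_W = (alpha / r) * B @ A,
    so alpha_new / fused_rank must equal alpha_old / rank.
    """
    return alpha * (fused_rank / rank)

# Example: 8 experts of rank 16 with alpha 32 fuse into rank 128.
assert adjusted_alpha(32, 16, 128) == 256.0
# The effective scaling factor is preserved.
assert adjusted_alpha(32, 16, 128) / 128 == 32 / 16
```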

BenjaminBossan and others added 2 commits March 6, 2026 16:00
This can be used in PEFT to apply weight conversion there without having
to re-implement the whole weight conversion machinery or call
transformer_model.load_adapter. We want to avoid the latter because PEFT
has a lot of custom weight-loading logic that would need adjusting for
that case.
@BenjaminBossan BenjaminBossan marked this pull request as ready for review March 6, 2026 16:17
BenjaminBossan (Member, Author) commented Mar 6, 2026

Note: The added test is a @slow test, probably need to invoke some magic command to run those.
Note 2: Actually lots of PEFT tests are failing, unrelated to this PR. For some I could confirm that it's due to transformers v5. It seems like they haven't been checked in a long time.

Cyrilvallez (Member) left a comment


All good on my side! Thanks for fixing! We can bump our peft dependency once you have released on your end to simplify everything!

Comment on lines +61 to +62
MIN_PEFT_VERSION = "0.18.0"
IS_PEFT_GE_019 = version.parse(importlib.metadata.version("peft")) >= version.parse("0.19.0")
Cyrilvallez (Member) commented:

@BenjaminBossan you need to protect this, looks like it crashes if peft is not installed
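One way to guard this (a sketch, not the fix actually committed in this PR): check that peft is importable before querying its installed version, since `importlib.metadata.version("peft")` raises `PackageNotFoundError` when the package is missing.

```python
import importlib.metadata
import importlib.util

from packaging import version

MIN_PEFT_VERSION = "0.18.0"

# Only query peft's version when the package is actually installed;
# otherwise importlib.metadata.version() raises PackageNotFoundError
# at import time of this module.
_peft_available = importlib.util.find_spec("peft") is not None
IS_PEFT_GE_019 = _peft_available and version.parse(
    importlib.metadata.version("peft")
) >= version.parse("0.19.0")
```

With the short-circuit, `IS_PEFT_GE_019` is simply `False` when peft is absent, so importing the module no longer crashes.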

HuggingFaceDocBuilderDev commented:

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

BenjaminBossan (Member, Author) commented:

Good catch @Cyrilvallez, I fixed this in the latest commit. Failing CI seems to be unrelated.

@BenjaminBossan BenjaminBossan added this pull request to the merge queue Mar 11, 2026
Merged via the queue into huggingface:main with commit 852f785 Mar 11, 2026
28 checks passed
@BenjaminBossan BenjaminBossan deleted the peft-weight-conversion-fixes branch March 11, 2026 17:44