Feature: Add FLUX.2 LOKR model support (detection and loading) #8909
lstein merged 4 commits into invoke-ai:main from
Conversation
Fix BFL LOKR models being misidentified as AIToolkit format
Fix alpha key warning in LOKR QKV split layers

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Pfannkuchensack
left a comment
Works fine with the two LoRAs I tested.
There are some warnings in the console:
```
[2026-02-25 02:18:11,165]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-double_stream_modulation_img.lin
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-double_stream_modulation_txt.lin
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-final_layer.linear
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-img_in
[2026-02-25 02:18:11,341]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-single_stream_modulation.lin
[2026-02-25 02:18:11,341]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-time_in.in_layer
[2026-02-25 02:18:11,342]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-time_in.out_layer
[2026-02-25 02:18:11,342]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-txt_in
```
These messages also appear in the flux2 small lora branch, however.
BFL's FLUX.2 model uses different names than diffusers' Flux2Transformer2DModel for top-level modules (embedders, modulations, output layers). The existing conversion only handled block-level renames (double_blocks→transformer_blocks), causing "Failed to find module" warnings for non-block LoRA keys like img_in, txt_in, modulation.lin, time_in, and final_layer.
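The non-block fix described above amounts to a prefix remap applied before block-level conversion. The BFL-side names below come straight from the warnings; the diffusers-side names are illustrative assumptions, not taken from the PR, and may differ from the actual `Flux2Transformer2DModel` attribute names.

```python
# Sketch of a BFL -> diffusers prefix remap for non-block FLUX.2 LoRA keys.
# Left side: BFL module names (from the warnings above). Right side:
# ASSUMED diffusers names -- placeholders, check Flux2Transformer2DModel.
_NON_BLOCK_RENAMES = {
    "img_in": "x_embedder",        # assumed diffusers name
    "txt_in": "context_embedder",  # assumed diffusers name
    "time_in": "time_embedder",    # assumed diffusers name
    "final_layer": "proj_out",     # assumed diffusers name
}


def remap_non_block_key(key: str) -> str:
    """Rewrite a non-block BFL key to its diffusers equivalent.

    Keys that do not start with a known BFL prefix (e.g. block-level
    keys handled by the existing double_blocks -> transformer_blocks
    conversion) pass through unchanged.
    """
    for bfl, dif in _NON_BLOCK_RENAMES.items():
        if key == bfl or key.startswith(bfl + "."):
            return dif + key[len(bfl):]
    return key
```

With a table like this in place, keys such as `img_in` and `time_in.in_layer` resolve to real modules instead of triggering the "Failed to find module" warnings.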
I just made a commit that fixes the warnings.
Thanks! If the code looks good, could you give it an approval? I cannot approve it on my own.
Also, |
…e-ai#8909)

* Add FLUX.2 LOKR model support (detection and loading) (#88)

  Fix BFL LOKR models being misidentified as AIToolkit format
  Fix alpha key warning in LOKR QKV split layers

  Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
  Co-authored-by: lstein <111189+lstein@users.noreply.github.com>

* Fix BFL→diffusers key mapping for non-block layers in FLUX.2 LoRA/LoKR

  BFL's FLUX.2 model uses different names than diffusers' Flux2Transformer2DModel for top-level modules (embedders, modulations, output layers). The existing conversion only handled block-level renames (double_blocks→transformer_blocks), causing "Failed to find module" warnings for non-block LoRA keys like img_in, txt_in, modulation.lin, time_in, and final_layer.

---------

Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Co-authored-by: Alexander Eichhorn <alex@eichhorn.dev>
Summary
This is essentially an update to PR #8862, which added Flux.2 Klein LoRA support. Even after that PR was applied, I found a number of Flux.2 LoRAs that work with Comfy but are not correctly detected and loaded by InvokeAI. They are a LOKR variant.
While most of these are NSFW LoRAs on Civitai, we should probably support them, and this PR fixes that.
Warning: it was mostly written by copilot and needs code review. I've done functional testing.
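For reviewers, the detection side can be sketched roughly as below. The `lokr_w1`/`lokr_w2` key pattern is the standard LyCORIS LoKR naming; the actual check in this PR, and its AIToolkit-format heuristic, may differ.

```python
def looks_like_lokr(state_dict: dict) -> bool:
    # LyCORIS-style LoKR layers store their Kronecker factors under
    # "<module>.lokr_w1" / "<module>.lokr_w2" (sometimes further split
    # into *_a / *_b low-rank pairs). A checkpoint containing such keys
    # should be routed to the LoKR loader even if its other keys resemble
    # the AIToolkit layout -- that misrouting is what this PR fixes.
    return any(".lokr_w1" in k or ".lokr_w2" in k for k in state_dict)
```

The point is ordering: a LoKR probe like this has to run before (or take precedence over) the AIToolkit-format check, otherwise BFL LoKR checkpoints fall through to the wrong loader.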
Related Issues / Discussions
QA Instructions
Here is an SFW LoRA to test with: flux-2-klein-4b-spritesheet-lora. It should download, be recognized as a Flux.2 LoRA, and generate properly.
Also check that other flavors of Flux.2 LoRAs still work.
Merge Plan
Simple merge.
Checklist
What's New copy (if doing a release after this PR)