Feature: Add FLUX.2 LOKR model support (detection and loading)#8909

Merged
lstein merged 4 commits into invoke-ai:main from lstein:feature/lokr-flux2
Feb 27, 2026

Conversation


@lstein lstein commented Feb 24, 2026

Summary

This is essentially an update to PR #8862, which added Flux.2 Klein LoRA support. Even with that PR applied, I have found a number of Flux.2 LoRAs that work with Comfy but are not correctly detected and loaded by InvokeAI. They are a LOKR variant.

While most of these are NSFW LoRAs on Civitai, we should probably support them, and this PR fixes that.

Warning: this was mostly written by Copilot and needs code review. I have done functional testing.

Related Issues / Discussions

QA Instructions

Here is an SFW LoRA to test with: flux-2-klein-4b-spritesheet-lora. It should download, be recognized as a Flux.2 LoRA, and generate properly.

Also check that other flavors of Flux.2 LoRAs still work.

Merge Plan

Simple merge.

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • ❗Changes to a redux slice have a corresponding migration
  • Documentation added / updated (if applicable)
  • Updated What's New copy (if doing a release after this PR)

Fix BFL LOKR models being misidentified as AIToolkit format



Fix alpha key warning in LOKR QKV split layers

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
@github-actions github-actions bot added python PRs that change python files backend PRs that change backend files python-tests PRs that change python tests labels Feb 24, 2026

@Pfannkuchensack Pfannkuchensack left a comment


Works fine with the two LoRAs I tested.

There are some warnings in the console:

[2026-02-25 02:18:11,165]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-double_stream_modulation_img.lin
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-double_stream_modulation_txt.lin
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-final_layer.linear
[2026-02-25 02:18:11,166]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-img_in
[2026-02-25 02:18:11,341]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-single_stream_modulation.lin
[2026-02-25 02:18:11,341]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-time_in.in_layer
[2026-02-25 02:18:11,342]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-time_in.out_layer
[2026-02-25 02:18:11,342]::[LayerPatcher]::WARNING --> Failed to find module for LoRA layer key: lora_transformer-txt_in

But these messages also appear in the flux2 small lora branch.

BFL's FLUX.2 model uses different names than diffusers' Flux2Transformer2DModel
for top-level modules (embedders, modulations, output layers). The existing
conversion only handled block-level renames (double_blocks→transformer_blocks),
causing "Failed to find module" warnings for non-block LoRA keys like img_in,
txt_in, modulation.lin, time_in, and final_layer.
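The fix described in that commit amounts to a prefix-rename table for the top-level modules. The sketch below illustrates the idea: the BFL-side names are taken from the warnings above, but the diffusers-side names are hypothetical placeholders, not the actual Flux2Transformer2DModel attribute names.

```python
# Illustrative sketch of the non-block key renaming described above.
# BFL-side names come from the warnings in this thread; the diffusers-side
# names are HYPOTHETICAL placeholders for illustration only.
_NON_BLOCK_RENAMES = {
    "img_in": "x_embedder",                      # hypothetical diffusers name
    "txt_in": "context_embedder",                # hypothetical
    "time_in.in_layer": "time_embed.linear_1",   # hypothetical
    "time_in.out_layer": "time_embed.linear_2",  # hypothetical
    "final_layer.linear": "proj_out",            # hypothetical
}


def rename_non_block_key(key: str) -> str:
    """Map a BFL-style top-level LoRA key to a diffusers-style name.

    Falls back to the original key when no rename applies, which is what
    produced the "Failed to find module" warnings before the fix.
    """
    for bfl_prefix, diffusers_prefix in _NON_BLOCK_RENAMES.items():
        if key == bfl_prefix or key.startswith(bfl_prefix + "."):
            return diffusers_prefix + key[len(bfl_prefix):]
    return key
```

The block-level renames (double_blocks→transformer_blocks) that PR #8862 already handled would run alongside this table.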
@Pfannkuchensack

I just pushed a commit that fixes the warnings.

@lstein

lstein commented Feb 25, 2026

Thanks!

If the code looks good, could you give it an approval? I cannot do it on my own.

@Pfannkuchensack Pfannkuchensack self-requested a review February 25, 2026 16:15

@Pfannkuchensack Pfannkuchensack left a comment


Works fine.

@JPPhoto

JPPhoto commented Feb 26, 2026

_convert_bfl_layer_patch_to_diffusers() only handles fused QKV when isinstance(layer, LoRALayer), and does not handle LoKR/FullLayer equivalents. That means img_attn.qkv/txt_attn.qkv in a non-LoRALayer patch will fall through and be kept as transformer_blocks.*.img_attn.qkv, which does not exist in diffusers. (invokeai/backend/patches/lora_conversions/flux_bfl_peft_lora_conversion_utils.py:465 and see logic around lines 471–506)
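The fused-QKV problem above applies to any patch type that can materialize a full weight delta, not just LoRALayer. A type-agnostic split could look like the following sketch; the function name and dimension arguments are assumptions for illustration, not InvokeAI's actual API.

```python
import torch


# Hypothetical sketch: split a fused qkv weight delta into separate q/k/v
# deltas, regardless of whether it came from a LoRA, LoKR, or full-rank
# patch. Function and argument names are illustrative assumptions.
def split_fused_qkv_delta(delta: torch.Tensor, q_dim: int, kv_dim: int) -> dict[str, torch.Tensor]:
    """Split a fused [q_dim + 2*kv_dim, in_features] delta along dim 0."""
    q, k, v = torch.split(delta, [q_dim, kv_dim, kv_dim], dim=0)
    return {"to_q": q, "to_k": k, "to_v": v}
```

Keying the split on "can this layer produce a full delta" rather than on `isinstance(layer, LoRALayer)` would cover the LoKR and FullLayer cases the review points out.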

Also, _validate_looks_like_lora() only checks lokr_w1, lokr_w2, hada_w1_a, and hada_w2_a. It doesn't check lokr_w1_a, lokr_w1_b, lokr_w2_a, lokr_w2_b, lokr_t2, hada_w1_b, hada_w2_b, hada_t1, and hada_t2. (invokeai/backend/model_manager/configs/lora.py:516 - the suffix list is around lines 540–553)
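A more complete suffix check covering the parameter names the review enumerates could be sketched as below. The suffix set lists the LyCORIS LoKR/LoHA parameter names mentioned in the comment; the function name is a hypothetical stand-in for InvokeAI's validator.

```python
# Sketch of a fuller suffix check for the heuristic described above.
# The suffixes are the LyCORIS LoKR/LoHA parameter names listed in the
# review; the function name is a hypothetical illustration.
LYCORIS_SUFFIXES = frozenset({
    "lokr_w1", "lokr_w1_a", "lokr_w1_b",
    "lokr_w2", "lokr_w2_a", "lokr_w2_b", "lokr_t2",
    "hada_w1_a", "hada_w1_b", "hada_w2_a", "hada_w2_b",
    "hada_t1", "hada_t2",
})


def looks_like_lycoris_key(key: str) -> bool:
    """Return True if a state-dict key ends in a known LyCORIS suffix."""
    suffix = key.rsplit(".", 1)[-1]
    return suffix in LYCORIS_SUFFIXES
```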

@lstein lstein enabled auto-merge (squash) February 27, 2026 00:37
@lstein lstein merged commit dfc66b7 into invoke-ai:main Feb 27, 2026
13 checks passed
@lstein lstein deleted the feature/lokr-flux2 branch February 27, 2026 00:45
lstein added a commit to lstein/InvokeAI that referenced this pull request Mar 2, 2026
…e-ai#8909)

* Add FLUX.2 LOKR model support (detection and loading) (#88)

Fix BFL LOKR models being misidentified as AIToolkit format



Fix alpha key warning in LOKR QKV split layers

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>

* Fix BFL→diffusers key mapping for non-block layers in FLUX.2 LoRA/LoKR

BFL's FLUX.2 model uses different names than diffusers' Flux2Transformer2DModel
for top-level modules (embedders, modulations, output layers). The existing
conversion only handled block-level renames (double_blocks→transformer_blocks),
causing "Failed to find module" warnings for non-block LoRA keys like img_in,
txt_in, modulation.lin, time_in, and final_layer.

---------

Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: lstein <111189+lstein@users.noreply.github.com>
Co-authored-by: Alexander Eichhorn <alex@eichhorn.dev>
