Fix load_lora_weights crash on modular sub-pipelines without transformer #13495

Open

ParamChordiya wants to merge 1 commit into huggingface:main from
Conversation
When calling `load_lora_weights` on a modular sub-pipeline that only contains certain components (e.g., text encoders), the method crashes with `AttributeError` because it unconditionally accesses `self.transformer`. This change uses safe attribute access (`getattr` with a default of `None`) and skips loading with a warning when the target component is not available on the pipeline.

Fixes both `Flux2LoraLoaderMixin` and `FluxLoraLoaderMixin`:

- Flux2: warns and returns early if the transformer is missing
- Flux1: independently skips transformer/text_encoder loading when either is missing; only returns early if both are absent

Fixes huggingface#13487
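The guard pattern described above can be sketched in a few lines. This is a minimal illustration, not the actual diffusers code: `LoraLoaderSketch`, `_load_into_transformer`, and `_load_into_text_encoder` are hypothetical stand-ins for the real mixin and its loading helpers.

```python
import logging

logger = logging.getLogger(__name__)


class LoraLoaderSketch:
    """Hypothetical stand-in for the LoRA loader mixin (illustration only)."""

    def _load_into_transformer(self, state_dict, transformer):
        # Placeholder for the real transformer LoRA loading logic.
        self.loaded_transformer = True

    def _load_into_text_encoder(self, state_dict, text_encoder):
        # Placeholder for the real text encoder LoRA loading logic.
        self.loaded_text_encoder = True

    def load_lora_weights(self, state_dict):
        # Safe attribute access: a modular sub-pipeline may lack either component.
        transformer = getattr(self, "transformer", None)
        text_encoder = getattr(self, "text_encoder", None)

        if transformer is None and text_encoder is None:
            logger.warning(
                "No transformer or text_encoder found on this pipeline; "
                "skipping LoRA loading."
            )
            return

        # Each component is guarded independently, so a text-encoder-only
        # sub-pipeline still loads its part of the LoRA weights.
        if transformer is not None:
            self._load_into_transformer(state_dict, transformer)
        else:
            logger.warning("Pipeline has no transformer; skipping transformer LoRA.")

        if text_encoder is not None:
            self._load_into_text_encoder(state_dict, text_encoder)
        else:
            logger.warning("Pipeline has no text_encoder; skipping text encoder LoRA.")
```

The key design choice is `getattr(self, "transformer", None)` instead of `self.transformer`: a missing component becomes a skippable `None` rather than an `AttributeError`.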
Summary
Fixes #13487
- `load_lora_weights` crashes with `AttributeError: 'Flux2ModularPipeline' object has no attribute 'transformer'` when called on a modular sub-pipeline that doesn't have the transformer component (e.g., a text-encoder-only sub-pipeline)
- `load_lora_weights` unconditionally accesses `self.transformer` instead of using safe `getattr`/`hasattr` patterns
- Fix: use safe attribute access (`getattr` with a default of `None`) and skip loading with a warning when the target component is not available

Changes
- `Flux2LoraLoaderMixin.load_lora_weights`: checks if the transformer exists; warns and returns early if missing.
- `FluxLoraLoaderMixin.load_lora_weights`: independently guards transformer and text_encoder loading, so a sub-pipeline with only one of them still works. Only returns early if both are absent.
- Test: added `test_load_lora_weights_warns_when_transformer_missing` to verify the fix.

Before
After
Test plan

- `load_lora_weights` on a full `Flux2Pipeline` still works normally
- `load_lora_weights` on a text-encoder-only modular sub-pipeline warns and skips without crashing
- `load_lora_weights` on a denoise-only modular sub-pipeline loads transformer LoRA correctly
- `pytest tests/lora/test_lora_layers_flux2.py`
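The new test itself isn't shown in this excerpt; a self-contained sketch of what `test_load_lora_weights_warns_when_transformer_missing` could check is below. `TextEncoderOnlySubPipeline` is a hypothetical dummy standing in for the real Flux2 modular sub-pipeline, and the warning capture uses a plain logging handler rather than pytest's `caplog`.

```python
import logging

logger = logging.getLogger("lora_sketch")


class TextEncoderOnlySubPipeline:
    """Hypothetical stand-in for a modular sub-pipeline without a transformer."""

    text_encoder = object()

    def load_lora_weights(self, state_dict):
        # Patched behavior: warn and skip instead of raising AttributeError.
        if getattr(self, "transformer", None) is None:
            logger.warning("Pipeline has no transformer; skipping transformer LoRA.")
            return
        # ... real loading would happen here ...


def test_load_lora_weights_warns_when_transformer_missing():
    # Capture warnings emitted by the loader via a throwaway handler.
    messages = []
    handler = logging.Handler()
    handler.emit = lambda record: messages.append(record.getMessage())
    logger.addHandler(handler)
    try:
        pipe = TextEncoderOnlySubPipeline()
        pipe.load_lora_weights({})  # must not raise AttributeError
    finally:
        logger.removeHandler(handler)
    assert any("skipping" in message for message in messages)
```

The real test would construct an actual text-encoder-only modular sub-pipeline and call `load_lora_weights` with real LoRA weights, but the shape of the assertion is the same: no exception, and a warning explaining what was skipped.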