`monkeypatch_lora` finds CrossAttention blocks and looks for any `Linear` modules. It uses `named_modules`, which yields all descendants (immediate children as well as nested sub-modules), but it then sets the replacement `LoraInjectedLinear` on the CrossAttention block directly. CrossAttention holds its output projection in a `ModuleList` named `to_out`, which contains a `Linear`. Because of this, the `to_out` Linear is not replaced correctly: instead, an (unused) `LoraInjectedLinear` module with the name `to_out.0` is set on the CrossAttention block.
You can tell by looking at the module names on the CrossAttention block after patching.
Before patching:
to_q
to_k
to_v
to_out
After patching:
to_q
to_k
to_v
to_out
to_out.0
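To make the failure mode concrete, here is a minimal, self-contained repro sketch. The class bodies below are simplified stand-ins for illustration, not the actual lora/diffusers code, and `replace_linear` is a hypothetical helper showing one way to fix the assignment:

```python
import torch.nn as nn

class LoraInjectedLinear(nn.Module):
    # Minimal stand-in for the real LoraInjectedLinear; the actual class
    # also carries the low-rank adapter weights.
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        return self.linear(x)

class CrossAttention(nn.Module):
    # Simplified layout: to_out is a ModuleList wrapping the output Linear,
    # matching the structure described above.
    def __init__(self, dim=8):
        super().__init__()
        self.to_q = nn.Linear(dim, dim)
        self.to_k = nn.Linear(dim, dim)
        self.to_v = nn.Linear(dim, dim)
        self.to_out = nn.ModuleList([nn.Linear(dim, dim)])

block = CrossAttention()

# Buggy pattern: named_modules() yields the nested name "to_out.0", and
# setattr with a dotted name registers a brand-new "to_out.0" entry on the
# block itself instead of replacing the Linear inside the ModuleList.
for name, child in list(block.named_modules()):
    if isinstance(child, nn.Linear):
        setattr(block, name,
                LoraInjectedLinear(child.in_features, child.out_features))

print([n for n, _ in block.named_children()])
# -> ['to_q', 'to_k', 'to_v', 'to_out', 'to_out.0']

# Fix sketch: resolve the *parent* of the dotted name and assign the
# replacement there, so the Linear inside the ModuleList is actually swapped.
def replace_linear(root, dotted_name, new_module):
    *parent_path, child_name = dotted_name.split(".")
    parent = root.get_submodule(".".join(parent_path)) if parent_path else root
    setattr(parent, child_name, new_module)
```

The same idea applies to the real patcher: split the name yielded by `named_modules`, fetch the parent with `get_submodule` (or by walking `named_modules` of the parent), and set the `LoraInjectedLinear` on that parent rather than on the CrossAttention block.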