I'm trying to run this in Kaggle notebooks.
```
File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:1012, in patch_pipe(pipe, maybe_unet_path, token, r, patch_unet, patch_text, patch_ti, idempotent_token, unet_target_replace_module, text_target_replace_module)
   1010 elif maybe_unet_path.endswith(".safetensors"):
   1011     safeloras = safe_open(maybe_unet_path, framework="pt", device="cpu")
-> 1012     monkeypatch_or_replace_safeloras(pipe, safeloras)
   1013     tok_dict = parse_safeloras_embeds(safeloras)
   1014     if patch_ti:

File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:809, in monkeypatch_or_replace_safeloras(models, safeloras)
    806         print(f"No model provided for {name}, contained in Lora")
    807         continue
--> 809     monkeypatch_or_replace_lora_extended(model, lora, target, ranks)

File /opt/conda/lib/python3.10/site-packages/lora_diffusion/lora.py:784, in monkeypatch_or_replace_lora_extended(model, loras, target_replace_module, r)
    781         _tmp.conv.bias = bias
    783     # switch the module
--> 784     _module._modules[name] = _tmp
    786     up_weight = loras.pop(0)
    787     down_weight = loras.pop(0)

UnboundLocalError: local variable '_tmp' referenced before assignment
```
This is the call that triggers it:
```python
patch_pipe(
    pipe,
    os.getcwd() + "/lora/example_loras/lora_illust.safetensors",
    patch_text=True,
    patch_ti=True,
    patch_unet=True,
)
```
If you can modify the source code, that might be the cleaner fix. If not, downgrade diffusers to v0.16.0, where the `Attention` module still uses `nn.Linear` rather than the `LoRACompatibleLinear` that causes the error above: `lora_diffusion` dispatches on the child module's exact class, so `LoRACompatibleLinear` (an `nn.Linear` subclass introduced in later diffusers releases) matches neither the Linear nor the Conv2d branch, and `_tmp` is never assigned before the module swap at line 784.
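If you go the downgrade route, here is a minimal sketch for a Kaggle notebook cell; the only assumption beyond the note above is that nothing else in the environment pins a newer diffusers:

```python
# Pin diffusers to v0.16.0, where Attention still uses plain nn.Linear,
# so lora_diffusion's exact-class checks match again.
!pip install -q "diffusers==0.16.0"

# After restarting the notebook kernel, verify the pin took effect:
import diffusers
assert diffusers.__version__ == "0.16.0", diffusers.__version__
```

The source-level alternative is to relax the exact-class comparisons in `lora_diffusion/lora.py` so they also accept `LoRACompatibleLinear`, but that means carrying a patched copy of the package.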