
Error when loading LoRA #34

Open
ChowMein47 opened this issue May 17, 2023 · 6 comments

@ChowMein47

The LoRA is already installed, but I can't use it. Please help me!

locon load lora method
loading Lora /content/cagliostro-colab-ui/models/Lora/koreanDollLikeness_v20.safetensors: AttributeError
Traceback (most recent call last):
File "/content/cagliostro-colab-ui/extensions/a1111-sd-webui-locon/scripts/../../../extensions-builtin/Lora/lora.py", line 222, in load_loras
lora = load_lora(name, lora_on_disk.filename)
File "/content/cagliostro-colab-ui/extensions/a1111-sd-webui-locon/scripts/main.py", line 370, in load_lora
is_sd2 = 'model_transformer_resblocks' in shared.sd_model.lora_layer_mapping
File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1614, in getattr
raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'LatentDiffusion' object has no attribute 'lora_layer_mapping'
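
For context on the traceback above: the locon script assumes that the webui's built-in Lora extension has already attached a lora_layer_mapping attribute to the loaded model, and when that hook hasn't run (for example because of a version mismatch), torch.nn.Module raises exactly this AttributeError. A minimal illustrative sketch of that failure mode, not code from either extension:

import torch.nn as nn

class LatentDiffusion(nn.Module):  # stand-in for the real model class
    pass

sd_model = LatentDiffusion()

# The extension effectively evaluates `sd_model.lora_layer_mapping`, which raises
# AttributeError when the built-in Lora hook never set that attribute.
mapping = getattr(sd_model, "lora_layer_mapping", None)
if mapping is None:
    print("lora_layer_mapping is missing -> the built-in Lora hook did not run")
else:
    is_sd2 = "model_transformer_resblocks" in mapping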

@Linaqruf
Owner

Which model are you using, and what's the LoRA link?

@ChowMein47
Author

ChowMein47 commented May 17, 2023

@Linaqruf This is the link. I tried other LoRAs from Naonaovn & Luxluna, but I'm still facing the same issue.

https://huggingface.co/Kanbara/doll-likeness-series/resolve/main/koreanDollLikeness_v20.safetensors

@Linaqruf
Owner

[screenshot] The LoRA loads successfully here with chilloutmix-ni as the model.
I think the problem is one of the following:

  • Your model isn't an SD v1.x model (KoreanDollLikeness was trained on SD v1.x)
  • The LoRA file got corrupted during download (a quick way to check is sketched after this list)
  • This probably happens with repo_type = AUTOMATIC1111 and AUTOMATIC1111-Dev
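
A minimal sketch for checking the second point above (this assumes the safetensors Python package is available in the Colab environment; the file path is the one from the traceback earlier in this issue):

from safetensors import safe_open

path = "/content/cagliostro-colab-ui/models/Lora/koreanDollLikeness_v20.safetensors"
try:
    with safe_open(path, framework="pt", device="cpu") as f:
        keys = list(f.keys())
    print(len(keys), "tensors, e.g.", keys[:3])
    # Kohya-style SD v1.x LoRAs usually have keys prefixed with "lora_unet_" / "lora_te_".
    print("looks like a kohya-style LoRA:", any(k.startswith("lora_unet_") for k in keys))
except Exception as err:
    # A truncated or corrupted download typically fails here with a header/parse error.
    print("file could not be parsed:", err)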


@PhoenixEugene

PhoenixEugene commented May 22, 2023

@Linaqruf Hello there. Since I'm apparently now facing a similar (or the same?) issue that @YellowStar11 faced a few days ago, I chose to write here instead of creating a new issue. (Whatever this is, it's definitely weird, because I used the Colab with the A1111 repo just yesterday and the day before and had no problems loading LoRAs on either occasion.)

[screenshot: cagliostro-colab-ui.ipynb in Colaboratory showing the LoRA loading errors]

You can see in the screenshot that something isn't right. After I got that error the first time, I hit the "Disconnect and delete runtime" button and tried everything again from scratch, and still the problem persisted. As for your possible explanations above:

  1. I only use SD v1.5 models; I've never tried or even wanted to use SD v2 models. In this instance, I believe the model was Galena DEDUX (NSFW).
  2. The LoRA "add_detail" wasn't the only LoRA that failed to load; all 16 of them did. So I'm not so sure about the chances of ALL the LoRA files getting corrupted after being downloaded, twice.
  3. Yes, as I mentioned before, it was the A1111 repo.

This leads me to believe it's something wrong with the A1111 repo, right? Then again, it worked fine just yesterday, so what could possibly have changed? And if it is a problem with the repo, is there a fix?

@Linaqruf
Owner

Linaqruf commented May 22, 2023

Hi, now I understand the problem. The a1111-sd-webui-locon extension is not compatible with the latest extension updates or with the AUTOMATIC1111-Dev repo, which is why the LoRA can't be loaded. This is also mentioned in the LoCon extension repository.

For a temporary fix, don't set update extensions to True; I pushed a commit last night to set it to False by default. Also, avoid using AUTOMATIC1111-Dev for now. These are only temporary workarounds. If you're looking for a long-term solution, here are some options:

  1. If it's a LoCon without cp_decomposition, try loading it with sd-webui-additional-networks.
  2. I have also installed a1111-sd-webui-lycoris, which is the continuation of a1111-sd-webui-locon. If the same error occurs again:
    • You can change the prefix for your LoRA from <lora:...> to <lyco:...> (example below).
    • Or load your LoCon from the LyCORIS tab in the Extra Networks panel. I centralized all LoRA paths to the folder /models/LoRA, so you can load LoRA implementations from Kohya, LyCORIS, or Automatic1111 from there.
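
For example (the 0.7 weight below is just an illustration, not a value from this thread): a prompt that used <lora:koreanDollLikeness_v20:0.7> would instead use <lyco:koreanDollLikeness_v20:0.7> once the LyCORIS extension handles it.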

I also plan to remove a1111-sd-webui-locon once all the problems are resolved. I know it's complicated, but because there is no universal standard for implementing LoRA, people make their own implementations, and the same LoRA can end up behaving differently depending on the algorithm.

Thank you.

@PhoenixEugene

PhoenixEugene commented May 22, 2023

Thank you so much for the thorough explanation! I'll try it today with "update extensions" set to False, as you suggested. Again, thanks!
