
'LatentDiffusion' object has no attribute 'lora_layer_mapping' after today git pull stable-diffusion-webui #38

Open
neskweek opened this issue May 14, 2023 · 6 comments

Comments

@neskweek

Hello,
after a regular git pull I now get this on each generation:

```
Traceback (most recent call last):
  File "E:\SD\stable-diffusion-webui\extensions\a1111-sd-webui-locon\scripts\..\..\..\extensions-builtin/Lora\lora.py", line 218, in load_loras
    lora = load_lora(name, lora_on_disk.filename)
  File "E:\SD\stable-diffusion-webui\extensions\a1111-sd-webui-locon\scripts\main.py", line 370, in load_lora
    is_sd2 = 'model_transformer_resblocks' in shared.sd_model.lora_layer_mapping
  File "E:\SD\stable-diffusion-webui\venv\lib\site-packages\torch\nn\modules\module.py", line 1614, in __getattr__
    raise AttributeError("'{}' object has no attribute '{}'".format(
AttributeError: 'LatentDiffusion' object has no attribute 'lora_layer_mapping'
```
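For context, here is a minimal standalone sketch (hypothetical stand-in class and helper, no webui or torch imports) of why this access raises and how a `hasattr()` guard, as used in the fixes later in this thread, avoids it:

```python
# Hypothetical stand-in for the real LatentDiffusion model class, which is
# created without a lora_layer_mapping attribute.
class LatentDiffusion:
    pass

def assign_mapping(model):
    # Assumption: stands in for assign_lora_names_to_compvis_modules,
    # which builds the name -> module dict and attaches it to the model.
    model.lora_layer_mapping = {"model_transformer_resblocks": object()}

model = LatentDiffusion()

# Direct access reproduces the AttributeError from the traceback above:
try:
    "model_transformer_resblocks" in model.lora_layer_mapping
except AttributeError as e:
    print(type(e).__name__)  # AttributeError

# Guarding with hasattr() builds the mapping on first use instead of crashing:
if not hasattr(model, "lora_layer_mapping"):
    assign_mapping(model)
print("model_transformer_resblocks" in model.lora_layer_mapping)  # True
```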

Running on Windows 11 with an RTX 3080.
`COMMANDLINE_ARGS= --xformers --listen`

@neskweek neskweek changed the title 'LatentDiffusion' object has no attribute 'lora_layer_mapping' after today git pull from https://github.com/AUTOMATIC1111/stable-diffusion-webui.git 'LatentDiffusion' object has no attribute 'lora_layer_mapping' after today git pull stable-diffusion-webui May 14, 2023
@neskweek neskweek reopened this May 14, 2023
@rangedreign

I also get this error, running on Windows 10 with a GTX 1660 Ti.
`COMMANDLINE_ARGS= --theme dark --autolaunch --precision full --no-half --no-half-vae --medvram --disable-safe-unpickle --reinstall-torch --xformers --always-batch-cond-uncond`

@hoodady

hoodady commented May 16, 2023

```
loading Lora Z:\sd\stable-diffusion-webui\models\Lora\locon_angelina_v1_from_v3_64_32.safetensors: AssertionError ?it/s]
Traceback (most recent call last):
  File "Z:\sd\stable-diffusion-webui\extensions-builtin\Lora\lora.py", line 222, in load_loras
    lora = load_lora(name, lora_on_disk.filename)
  File "Z:\sd\stable-diffusion-webui\extensions-builtin\Lora\lora.py", line 192, in load_lora
    assert False, f'Bad Lora layer name: {key_diffusers} - must end in lora_up.weight, lora_down.weight or alpha'
AssertionError: Bad Lora layer name: lora_unet_down_blocks_0_downsamplers_0_conv.lora_mid.weight - must end in lora_up.w
```

I get this error at random. What's going on?
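The assertion fires because upstream `lora.py` accepts only three key suffixes, while LoCon networks carry extra `lora_mid.weight` tensors for conv layers. A small standalone sketch of that suffix check (suffix list taken from the assertion message above; function name is hypothetical):

```python
# Suffix whitelist copied from the assertion message in the traceback above.
ALLOWED_SUFFIXES = ("lora_up.weight", "lora_down.weight", "alpha")

def is_plain_lora_key(key_diffusers: str) -> bool:
    # str.endswith accepts a tuple of candidate suffixes.
    return key_diffusers.endswith(ALLOWED_SUFFIXES)

print(is_plain_lora_key("lora_unet_x.lora_up.weight"))  # True
# LoCon's extra mid tensor fails the check, triggering the AssertionError:
print(is_plain_lora_key(
    "lora_unet_down_blocks_0_downsamplers_0_conv.lora_mid.weight"))  # False
```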

@Kadah
Contributor

Kadah commented May 21, 2023

The extension seems to be broken in a few ways by upstream A1111 changes, though I haven't hit any of them except the following one.

Another breaking change in dev is AUTOMATIC1111/stable-diffusion-webui@39ec4f0, for which I've made a PR: #39

@duhast123

In the previous version of `scripts\main.py`, add these lines:

```python
def assign_lora_names_to_compvis_modules(sd_model):
    lora_layer_mapping = {}

    for name, module in shared.sd_model.cond_stage_model.wrapped.named_modules():
        lora_name = name.replace(".", "_")
        lora_layer_mapping[lora_name] = module
        module.lora_layer_name = lora_name

    for name, module in shared.sd_model.model.named_modules():
        lora_name = name.replace(".", "_")
        lora_layer_mapping[lora_name] = module
        module.lora_layer_name = lora_name

    sd_model.lora_layer_mapping = lora_layer_mapping

def load_lora(name, filename):
    print('locon load lora method')
    lora = LoraModule(name)
    lora.mtime = os.path.getmtime(filename)

    sd = sd_models.read_state_dict(filename)

    if not hasattr(shared.sd_model, 'lora_layer_mapping'):
        assign_lora_names_to_compvis_modules(shared.sd_model)
```

@NoppaiKohai

I'm getting this error too. Has anyone figured out how to fix it?

@aetherwu

aetherwu commented Jun 12, 2023

> In the previous version of `scripts\main.py`, add these lines: (quoting the code from @duhast123's comment above)

It works for me.
Let me clean up the code for others:

Add this function before `def load_lora`:

```python
def assign_lora_names_to_compvis_modules(sd_model):
    lora_layer_mapping = {}

    for name, module in shared.sd_model.cond_stage_model.wrapped.named_modules():
        lora_name = name.replace(".", "_")
        lora_layer_mapping[lora_name] = module
        module.lora_layer_name = lora_name

    for name, module in shared.sd_model.model.named_modules():
        lora_name = name.replace(".", "_")
        lora_layer_mapping[lora_name] = module
        module.lora_layer_name = lora_name

    sd_model.lora_layer_mapping = lora_layer_mapping
```

Then replace the first few lines of `def load_lora` with:

```python
def load_lora(name, lora_on_disk):
    print('locon load lora method')
    lora = LoraModule(name, lora_on_disk)
    lora.mtime = os.path.getmtime(lora_on_disk.filename)
    sd = sd_models.read_state_dict(lora_on_disk.filename)

    if not hasattr(shared.sd_model, 'lora_layer_mapping'):
        assign_lora_names_to_compvis_modules(shared.sd_model)

    is_sd2 = True  # I don't really know what this is for; another function asks for it.
```

The `lora_on_disk` part is compatible with the latest PR:
https://github.com/KohakuBlueleaf/a1111-sd-webui-locon/pull/39/files

The reported error disappeared:

AttributeError: 'LatentDiffusion' object has no attribute 'lora_layer_mapping'

However, one new error emerged:

Failed to match keys when loading Lora models\Lora.....

Magically, it does not affect the final result. All LoRA models work fine for me...
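As a side note, the mapping that `assign_lora_names_to_compvis_modules` builds just normalizes module paths (`.` becomes `_`) so LoRA weight names can be resolved to modules in one dict lookup. A standalone sketch with a hypothetical mini module tree (the `Leaf` class and dict-based `named_modules` stand in for torch modules, no torch imports):

```python
# Hypothetical leaf module standing in for a torch.nn.Module.
class Leaf:
    pass

def named_modules(tree, prefix=""):
    # Minimal stand-in for torch.nn.Module.named_modules(): yields
    # (dotted_path, module) pairs for each leaf in a nested dict.
    for name, child in tree.items():
        path = f"{prefix}.{name}" if prefix else name
        if isinstance(child, dict):
            yield from named_modules(child, path)
        else:
            yield path, child

# A tiny pretend model tree: model.down_blocks.0.conv
model = {"down_blocks": {"0": {"conv": Leaf()}}}

# Same normalization as in the fix above: "." in paths becomes "_".
lora_layer_mapping = {}
for name, module in named_modules(model):
    lora_name = name.replace(".", "_")
    lora_layer_mapping[lora_name] = module
    module.lora_layer_name = lora_name

print(sorted(lora_layer_mapping))  # ['down_blocks_0_conv']
```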
