
ValueError: Checkpoint not supported because layer lora_unet_down_blocks_0_downsamplers_0_conv.alpha not supported. #6368


Description

@cjt222

Describe the bug

Hello, I'm loading a LoRA model from Civitai that includes convolutional layers, and it raises an error. Does diffusers not support this yet?

ValueError: Checkpoint not supported because layer lora_unet_down_blocks_0_downsamplers_0_conv.alpha not supported.
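To see which entries in the checkpoint trigger this error, one could open the .safetensors file with safetensors and filter the state-dict keys. Below is a minimal sketch over hypothetical key names (no file access), assuming the LyCORIS/LoCon-style naming shown in the traceback; the `sample_keys` list and `conv_layer_keys` helper are illustrations, not part of diffusers.

```python
# Hypothetical key names mimicking a LyCORIS/LoCon-style LoRA state dict;
# keys targeting conv layers (like the one in the traceback) are the ones
# the diffusers loader rejects here.
sample_keys = [
    "lora_unet_down_blocks_0_downsamplers_0_conv.alpha",
    "lora_unet_down_blocks_0_downsamplers_0_conv.lora_down.weight",
    "lora_unet_mid_block_attentions_0_proj_in.lora_up.weight",
]

def conv_layer_keys(keys):
    # Flag keys whose module path (the part before the first ".")
    # contains a "conv" segment.
    return [k for k in keys if "conv" in k.split(".")[0].split("_")]

print(conv_layer_keys(sample_keys))
```

With a real checkpoint, the same filter could be applied to the keys returned by `safetensors.safe_open(path, framework="pt").keys()`.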

Reproduction

import torch
from diffusers import StableDiffusionXLPipeline, LCMScheduler

# base_model_path and lcm_lora_id point to the base model and LoRA checkpoint
pipe = StableDiffusionXLPipeline.from_pretrained(
    base_model_path,
    torch_dtype=torch.float16,
    add_watermarker=False,
)

# This call raises the ValueError above
pipe.load_lora_weights(lcm_lora_id, weight_name="XL_LoRA_LCM_Sampler2.safetensors")
pipe.scheduler = LCMScheduler.from_config(pipe.scheduler.config)

Logs

No response

System Info

  • diffusers version: 0.25.0.dev0
  • Platform: Linux-5.4.0-48-generic-x86_64-with-glibc2.27
  • Python version: 3.10.13
  • PyTorch version (GPU?): 2.1.0+cu121 (True)
  • Huggingface_hub version: 0.20.1
  • Transformers version: 4.36.2
  • Accelerate version: 0.24.1
  • xFormers version: 0.0.22.post7
  • Using GPU in script?:
  • Using distributed or parallel set-up in script?:

Who can help?

No response

Metadata


Assignees

No one assigned

    Labels

bug (Something isn't working), lora, stale (Issues that haven't received updates)
