Closed
Labels
bug (Something isn't working)
Describe the bug
unet model: https://civitai.com/models/50696/qteamix-q
unet config: https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/unet/config.json
The following code runs correctly in diffusers 0.21.0:
import torch
import safetensors.torch
from diffusers import UNet2DConditionModel
from diffusers.pipelines.stable_diffusion.convert_from_ckpt import convert_ldm_unet_checkpoint
from accelerate import init_empty_weights

with init_empty_weights():
    unet_config = UNet2DConditionModel.load_config("config.json")
    unet = UNet2DConditionModel.from_config(unet_config).to(torch.bfloat16)

state_dict = safetensors.torch.load_file("model.safetensors")
state_dict = convert_ldm_unet_checkpoint(state_dict, unet.config)
but it fails in diffusers 0.22.0. Error message:
Traceback (most recent call last):
File "/home/veturbo-diffusion/a.py", line 15, in <module>
state_dict = convert_ldm_unet_checkpoint(state_dict, unet.config)
File "/root/miniconda3/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 450, in convert_ldm_unet_checkpoint
new_checkpoint["class_embedding.weight"] = unet_state_dict["label_emb.weight"]
KeyError: 'label_emb.weight'
Fail reason
In diffusers v0.22.0 and v0.23.0, convert_from_ckpt.py (line 450) sets new_checkpoint["class_embedding.weight"] from unet_state_dict["label_emb.weight"] whenever "num_class_embeds" is present in unet_config. However, "num_class_embeds" always exists in unet_config (most of the time its value is None), while "label_emb.weight" is sometimes absent from unet_state_dict. In that case Python raises a KeyError.
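A minimal sketch of a safer guard, written as a hypothetical standalone helper (this is an assumption about how the check could be tightened, not the actual diffusers patch): only copy the class-embedding weight when the config value is non-None and the key really exists in the checkpoint.

```python
def convert_class_embedding(unet_state_dict, unet_config, new_checkpoint):
    """Hypothetical fix sketch: guard on both the config value and the key.

    "num_class_embeds" is always present in the config (usually None), so
    checking `"num_class_embeds" in unet_config` alone is not enough; we also
    require the weight to actually exist in the checkpoint's state dict.
    """
    if (
        unet_config.get("num_class_embeds") is not None
        and "label_emb.weight" in unet_state_dict
    ):
        new_checkpoint["class_embedding.weight"] = unet_state_dict["label_emb.weight"]
    return new_checkpoint
```

With this guard, a checkpoint that lacks "label_emb.weight" is simply skipped instead of raising a KeyError.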
Reproduction
import torch
import safetensors.torch
from diffusers import UNet2DConditionModel
from diffusers.pipelines.stable_diffusion.convert_from_ckpt import convert_ldm_unet_checkpoint
from accelerate import init_empty_weights

with init_empty_weights():
    unet_config = UNet2DConditionModel.load_config("config.json")
    unet = UNet2DConditionModel.from_config(unet_config).to(torch.bfloat16)

state_dict = safetensors.torch.load_file("model.safetensors")
state_dict = convert_ldm_unet_checkpoint(state_dict, unet.config)
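Until the conversion code is fixed, one possible workaround sketch (an assumption based on the failing branch described above, which fires on `"num_class_embeds" in unet_config`): pass a copy of the config with a None-valued "num_class_embeds" removed, so the membership test no longer triggers. `strip_null_class_embeds` is a hypothetical helper, not a diffusers API.

```python
def strip_null_class_embeds(unet_config):
    """Return a plain-dict copy of the config without a None-valued
    "num_class_embeds", so the `"num_class_embeds" in unet_config` check in
    convert_ldm_unet_checkpoint (v0.22.0, line 450) does not fire for models
    that have no class embedding."""
    config = dict(unet_config)
    if config.get("num_class_embeds") is None:
        config.pop("num_class_embeds", None)
    return config
```

Usage would then be `state_dict = convert_ldm_unet_checkpoint(state_dict, strip_null_class_embeds(unet.config))`; configs with a real (non-None) "num_class_embeds" are passed through unchanged.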
Logs
Traceback (most recent call last):
File "/home/veturbo-diffusion/a.py", line 15, in <module>
state_dict = convert_ldm_unet_checkpoint(state_dict, unet.config)
File "/root/miniconda3/lib/python3.10/site-packages/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py", line 450, in convert_ldm_unet_checkpoint
new_checkpoint["class_embedding.weight"] = unet_state_dict["label_emb.weight"]
KeyError: 'label_emb.weight'
### System Info
- `diffusers` version: 0.22.0
- Platform: Linux-5.4.143-2-velinux1-amd64-x86_64-with-glibc2.31
- Python version: 3.10.12
- PyTorch version (GPU?): 1.13.1+cu117 (True)
- Huggingface_hub version: 0.17.3
- Transformers version: 4.35.0
- Accelerate version: 0.24.1
- xFormers version: not installed
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>
### Who can help?
@yiyixuxu @DN6 @sayakpaul @patrickvonplaten