[0.28.0]: from_single_file doesn't work with custom checkpoints #8359
Comments
@putdanil Can you install diffusers from main and try again? Is it possible for you to host the checkpoint you're trying to load on the Hugging Face Hub and share it here?
@putdanil By any chance, is this a merged inpainting checkpoint?
It is, I'm in the process of uploading it.
@DN6 Here is the link: dputilov/juggernautxlinpaint
@putdanil Could you try passing in the config argument when loading the model?

```python
pipe = StableDiffusionXLControlNetInpaintPipeline.from_single_file(
    "checkpoint_sdxl_generation.safetensors",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
    controlnet=[controlnet_midas_sdxl, controlnet_canny_sdxl],
    vae=vae,
    config="diffusers/stable-diffusion-xl-1.0-inpainting-0.1",
).to("cuda")
```
@DN6 Thanks! Everything works now. |
Describe the bug
Since version 0.28.0, custom SDXL checkpoints can't be loaded with from_single_file. All SDXL pipelines fail, with or without ControlNet.
Tried setting low_cpu_mem_usage=False and ignore_mismatched_sizes=True; it doesn't change anything.
Reproduction

```python
pipe = StableDiffusionXLControlNetInpaintPipeline.from_single_file(
    "checkpoint_sdxl_generation.safetensors",
    torch_dtype=torch.float16,
    variant="fp16",
    use_safetensors=True,
    controlnet=[controlnet_midas_sdxl, controlnet_canny_sdxl],
    vae=vae,
).to("cuda")
```
Logs

```
ValueError: Cannot load because down_blocks.1.attentions.0.proj_in.weight expected shape tensor(..., device='meta', size=(640, 640, 1, 1)), but got torch.Size([640, 640]). If you want to instead overwrite randomly initialized weights, please make sure to pass both `low_cpu_mem_usage=False` and `ignore_mismatched_sizes=True`. For more information, see also: #1619 (comment) as an example.
```

System Info

Who can help?
No response
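A likely cause of the shape mismatch in the log (an inference on my part, not stated in the thread): the config inferred for the merged checkpoint builds the attention block's `proj_in` as a 1x1 `nn.Conv2d`, which has a 4-D weight, while the checkpoint was saved from a model configured with `use_linear_projection=True`, where `proj_in` is an `nn.Linear` with a 2-D weight. A minimal PyTorch sketch of the two shapes:

```python
import torch.nn as nn

# proj_in as a 1x1 convolution (what the inferred config expects): 4-D weight
conv_proj = nn.Conv2d(640, 640, kernel_size=1)
print(tuple(conv_proj.weight.shape))  # (640, 640, 1, 1)

# proj_in as a linear projection (what the checkpoint actually contains): 2-D weight
linear_proj = nn.Linear(640, 640)
print(tuple(linear_proj.weight.shape))  # (640, 640)
```

Passing `config="diffusers/stable-diffusion-xl-1.0-inpainting-0.1"`, as suggested above, forces the architecture that matches the checkpoint's weights instead of the mis-detected one.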