Closed
Labels
bug (Something isn't working)
Description
Describe the bug
A RuntimeError occurs when using the following combination:
- SD3
- Batch inference (num_images_per_prompt > 1)
- LyCORIS
- skip_guidance_layers is set

The error message is: "RuntimeError: The size of tensor a (2) must match the size of tensor b (4) at non-singleton dimension 0"
It seems that batch inference (num_images_per_prompt > 1) does not work in conjunction with skip_guidance_layers.
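One plausible reading of the error message (a sketch of my guess, not taken from the diffusers source): with classifier-free guidance the model input is duplicated to 2 * num_images_per_prompt = 4, while the skip-layer guidance pass appears to operate on a tensor that was never duplicated (batch 2), and the two cannot be combined elementwise. The hypothetical `combine` helper below only mimics PyTorch's batch-dimension check to illustrate the shapes involved:

```python
# Hypothetical sketch of the suspected batch-size mismatch.
# The combine() helper and the shapes are illustrative only; none of
# these names come from the diffusers pipeline source.

def combine(shape_a, shape_b):
    """Return the result shape of an elementwise op on two tensors,
    raising the same style of error PyTorch raises on a batch mismatch."""
    a, b = shape_a[0], shape_b[0]
    if a != b and a != 1 and b != 1:
        raise RuntimeError(
            f"The size of tensor a ({min(a, b)}) must match the size of "
            f"tensor b ({max(a, b)}) at non-singleton dimension 0"
        )
    return (max(a, b),) + shape_a[1:]

num_images_per_prompt = 2
cfg_batch = 2 * num_images_per_prompt   # cond + uncond halves -> batch 4
skip_layer_batch = num_images_per_prompt  # presumably not duplicated -> batch 2

try:
    combine((cfg_batch, 16, 128, 128), (skip_layer_batch, 16, 128, 128))
except RuntimeError as e:
    # Prints: The size of tensor a (2) must match the size of tensor b (4)
    # at non-singleton dimension 0
    print(e)
```

With num_images_per_prompt=1 the two batch sizes are 2 and 1, which broadcasts cleanly, which would explain why only batched inference fails.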
Reproduction
This code snippet produces the error (imports added for completeness; `my_lora` and `request` come from the surrounding application code):

```python
import torch
from diffusers import StableDiffusion3Pipeline, FlowMatchEulerDiscreteScheduler
from lycoris import create_lycoris_from_weights

self.pipe = StableDiffusion3Pipeline.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers",
    torch_dtype=torch.bfloat16,
)
self.pipe.scheduler = FlowMatchEulerDiscreteScheduler.from_config(
    self.pipe.scheduler.config,
    timestep_spacing="trailing",
    shift=3.0,
)
self.pipe.to("cuda")

# Merge the LyCORIS weights into the transformer
lora_scale = 1.0
wrapper, _ = create_lycoris_from_weights(lora_scale, my_lora, self.pipe.transformer)
wrapper.merge_to()

image = self.pipe(
    prompt=request.prompt,
    num_inference_steps=request.num_inference_steps,
    num_images_per_prompt=2,  # Batch inference
    output_type="pil",
    generator=torch.Generator(device="cuda").manual_seed(42),
    guidance_scale=request.guidance_scale,
    width=request.width,
    height=request.height,
    skip_guidance_layers=[7, 8, 9],  # Doesn't seem to work with batching
).images[0]
```

Commenting out skip_guidance_layers resolves the error.
Expected behavior
Batch inference should work correctly even when skip_guidance_layers is used with LyCORIS.
Logs
No response
System Info
Environment
- CUDA Version: 12.4
- Python version: 3.12.1 (main, Jan 11 2024, 10:22:40) [GCC 10.2.1 20210110]
- Diffusers version: https://github.com/huggingface/diffusers.git@99c0483b67427de467f11aa35d54678fd36a7ea2
- The specific LyCORIS model and inference method used are from bghira: https://huggingface.co/bghira/sd35m-photo-mixedres-cL-sS3-noOverride?not-for-all-audiences=true
Who can help?