VAE configuration issue #6753

@spezialspezial

Description

Describe the bug

It seems the VAE configuration might be broken in recent commits (~7f58a76f). Even though I can verify that the correct config contents are read in, I still end up with the wrong VAE config. Most notably, the scaling_factor for SDXL models is that of SD1.5 (0.18215 instead of 0.13025), resulting in washed-out generations. I suspect the error is somewhere in create_diffusers_vae_model_from_ldm, but I'm not sure at the moment. Just me? Any ideas?

Reproduction

  1. Instantiate an SDXL pipeline from a single checkpoint file.
  2. Check the VAE scaling factor. If it is not 0.13025, there may be an issue (see the sketch below).
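A minimal sketch of the reproduction, assuming a local SDXL single-file checkpoint (the path sd_xl_base_1.0.safetensors is a placeholder) loaded via StableDiffusionXLPipeline.from_single_file:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Load an SDXL pipeline from a single checkpoint file (placeholder path).
pipe = StableDiffusionXLPipeline.from_single_file(
    "sd_xl_base_1.0.safetensors",
    torch_dtype=torch.float16,
)

# SDXL VAEs are expected to use scaling_factor 0.13025; SD1.5 uses 0.18215.
print(pipe.vae.config.scaling_factor)
```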

Logs

No response

System Info

diffusers 0.26.0.dev0 (7f58a76), Linux, Python 3.10.12, torch 2.1.2+cu121, CUDA 12.1

Who can help?

@sayakpaul

Labels

bug (Something isn't working)
