
about LCM onnx model #1497

Closed
Amin456789 opened this issue Oct 31, 2023 · 6 comments
Labels
bug Something isn't working

Comments


Amin456789 commented Oct 31, 2023

Hi!

Can someone please explain how we can use the LCM model in ONNX? I see you made a script to run it in ONNX, but what about the model? Can we simply use the normal Stable Diffusion ONNX conversion script for the LCM model too, or do we have to wait for someone to make a conversion script?

Or could someone convert the LCM model to ONNX, upload it to Hugging Face, and share it with us, please?

Kind regards

Who can help?

@echarlaix


echarlaix commented Oct 31, 2023

Hi @Amin456789,

To export latent consistency models, you can either use the CLI:

optimum-cli export onnx --model SimianLuo/LCM_Dreamshaper_v7 lcm_onnx/

or directly load the model with ORTLatentConsistencyModelPipeline and convert it to ONNX on the fly by setting export=True:

from optimum.onnxruntime import ORTLatentConsistencyModelPipeline

pipe = ORTLatentConsistencyModelPipeline.from_pretrained("SimianLuo/LCM_Dreamshaper_v7", export=True)
prompt = "sailing ship in storm by Leonardo da Vinci"
images = pipe(prompt=prompt, num_inference_steps=4, guidance_scale=8.0).images

Also, LCM support was added in #1469, so you'll need to install Optimum from source for now.

@Amin456789

Thank you so much!


Amin456789 commented Oct 31, 2023

@echarlaix Another question: does this version of LCM run on the CPU too? And can this ONNX model be converted to fp16 or quantized to int8?

@nbertagnolli

Thanks for working on support for latent consistency models! :) When I run the above code, I get the following error:

ValueError: Pipeline <class 'diffusers.pipelines.stable_diffusion.pipeline_stable_diffusion.StableDiffusionPipeline'> expected {'unet', 'safety_checker', 'scheduler', 'text_encoder', 'tokenizer', 'vae', 'feature_extractor'}, but only {'unet', 'safety_checker', 'text_encoder', 'tokenizer', 'vae', 'feature_extractor'} were passed.

It looks like it can't find the scheduler. I've installed everything from source, and when I load the model using vanilla diffusers I see that it has an associated scheduler:


from diffusers import DiffusionPipeline

pipe = DiffusionPipeline.from_pretrained("SimianLuo/LCM_Dreamshaper_v7", custom_pipeline="latent_consistency_txt2img", custom_revision="main")
print(pipe.scheduler)

I get:

LCMScheduler {
  "_class_name": "LCMScheduler",
  "_diffusers_version": "0.22.0.dev0",
  "beta_end": 0.012,
  "beta_schedule": "scaled_linear",
  "beta_start": 0.00085,
  "clip_sample": true,
  "clip_sample_range": 1.0,
  "dynamic_thresholding_ratio": 0.995,
  "num_train_timesteps": 1000,
  "prediction_type": "epsilon",
  "rescale_betas_zero_snr": false,
  "sample_max_value": 1.0,
  "set_alpha_to_one": true,
  "steps_offset": 0,
  "thresholding": false,
  "timestep_spacing": "leading",
  "trained_betas": null
}

Any thoughts on why the ORTLatentConsistencyModelPipeline can't find the scheduler in this case?


NeusZimmer commented Dec 28, 2023

@echarlaix Another question: does this version of LCM run on the CPU too? And can this ONNX model be converted to fp16 or quantized to int8?

I've converted it to fp16 with no issues, and it runs fine (also on the CPU). I will upload it to Civitai if the owner of the model allows it.

@echarlaix

@nbertagnolli, do you still have this issue with diffusers v0.22.0 or higher? If so, could you open a new issue describing the error you're getting and ping me there?
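Checking the installed diffusers version against that threshold can be done with the stdlib alone; this is a minimal sketch (the at_least helper is made up here, and the comparison ignores pre-release suffixes like ".dev0"):

```python
from importlib import metadata

def at_least(version: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, dropping non-numeric parts."""
    def key(v: str) -> tuple:
        return tuple(int(p) for p in v.split(".") if p.isdigit())
    return key(version) >= key(minimum)

try:
    installed = metadata.version("diffusers")
    # LCMScheduler first shipped in diffusers v0.22.0.
    print(installed, at_least(installed, "0.22.0"))
except metadata.PackageNotFoundError:
    print("diffusers is not installed")
```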

Great news, thanks for sharing @NeusZimmer!
