Conversation

@sayakpaul sayakpaul (Member) commented Dec 27, 2023

What does this PR do?

Follow up of #6306.

Fixes #6351

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@younesbelkada younesbelkada (Contributor) left a comment


Thanks Sayak! Can you also replace this line: https://github.com/huggingface/diffusers/blob/main/examples/consistency_distillation/train_lcm_distill_lora_sdxl.py#L1187 with accelerator.unwrap_model(unet).disable_adapters() in order to fix #6351 at the same time? 🙏
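The error in #6351 arises because DistributedDataParallel wraps the model, so PEFT methods such as disable_adapters() live on the inner module, not on the wrapper; unwrapping first restores access to them. A minimal pure-Python sketch of the failure mode and the fix — ToyUNet, ToyDDP, and unwrap_model here are illustrative stand-ins, not the real torch/accelerate APIs:

```python
class ToyUNet:
    """Stand-in for a PEFT-enabled UNet with an adapter toggle."""
    def __init__(self):
        self.adapters_enabled = True

    def disable_adapters(self):
        self.adapters_enabled = False


class ToyDDP:
    """Stand-in for DistributedDataParallel: holds the model in `.module`
    but does not expose the inner model's custom methods."""
    def __init__(self, module):
        self.module = module


def unwrap_model(model):
    # Accelerate's unwrap_model similarly peels off wrapper layers;
    # this toy version just reads the `.module` attribute if present.
    return getattr(model, "module", model)


unet = ToyDDP(ToyUNet())

try:
    unet.disable_adapters()  # fails: the wrapper has no such attribute
except AttributeError as e:
    print(f"wrapped call failed: {e}")

unwrap_model(unet).disable_adapters()       # works on the unwrapped model
print(unwrap_model(unet).adapters_enabled)  # → False
```

The same pattern applies to any custom method on a DDP-wrapped model: call it through the unwrapped module, as the training script now does.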

@sayakpaul sayakpaul (Member, Author) commented

@younesbelkada done. Would appreciate another thorough review.

@younesbelkada younesbelkada (Contributor) left a comment


Thanks very much Sayak!

@sayakpaul sayakpaul merged commit 1ac07d8 into main Dec 28, 2023
donhardman pushed a commit to donhardman/diffusers that referenced this pull request Dec 29, 2023
* add to dreambooth lora.

* add: t2i lora.

* add: sdxl t2i lora.

* style

* lcm lora sdxl.

* unwrap

* fix: enable_adapters().
antoine-scenario pushed a commit to antoine-scenario/diffusers that referenced this pull request Jan 2, 2024
@sayakpaul sayakpaul deleted the make-peft-sd-work-non-peft-others branch March 11, 2024 03:00
AmericanPresidentJimmyCarter pushed a commit to AmericanPresidentJimmyCarter/diffusers that referenced this pull request Apr 26, 2024

Development

Successfully merging this pull request may close these issues.

when i run train_lcm_distill_lora_sdxl.py, it cause error:'DistributedDataParallel' object has no attribute 'disable_adapters'
