
Enable xformers by default #1343

Closed
chinoll opened this issue Nov 19, 2022 · 6 comments

chinoll (Contributor) commented Nov 19, 2022

Is your feature request related to a problem? Please describe.
Diffusers is too slow to train in the default configuration.

Describe the solution you'd like
If xformers is installed in the environment, xformers is enabled by default.


camenduru (Contributor) commented

@chinoll Good idea 🤩 I am doing it like this on Colab, maybe it helps:

!pip install -qq https://github.com/camenduru/stable-diffusion-webui-colab/releases/download/0.0.14/xformers-0.0.14.dev0-cp37-cp37m-linux_x86_64.whl
!git clone https://github.com/huggingface/diffusers.git
!sed -i -e 's/if self._use_memory_efficient_attention_xformers:/if True:/g' /content/diffusers/src/diffusers/models/attention.py
!pip install -qq /content/diffusers

patrickvonplaten (Contributor) commented

I'm actually not opposed to enabling it by default when it is correctly installed. So it's OK for me to wrap enable_xformers_.. in a try ... except statement. cc @patil-suraj, what do you think?
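The guarded enable suggested here could look like the following sketch (the helper name try_enable_xformers is hypothetical, and pipe is assumed to be a diffusers pipeline exposing enable_xformers_memory_efficient_attention()):

```python
# Hedged sketch, not the actual diffusers implementation: try to turn on
# xformers memory-efficient attention, and fall back silently if it fails.
def try_enable_xformers(pipe):
    """Return True if memory-efficient attention was enabled on `pipe`."""
    try:
        pipe.enable_xformers_memory_efficient_attention()
        return True
    except Exception:
        # xformers is not installed, or the wheel is incompatible with the
        # current torch/CUDA build: keep the default attention instead.
        return False
```

With a wrapper like this, training scripts could opportunistically use the faster attention whenever the wheel is present, as proposed above.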

patil-suraj (Contributor) commented

Good idea, will open a PR for this soon :)

patil-suraj (Contributor) commented

Fixed in #1354

jtoy commented Jan 17, 2023

What is the safe way to know if it's enabled from code?

patil-suraj (Contributor) commented

Hey @jtoy, xformers attention is not enabled by default anymore (see #1640); you need to call pipeline.enable_xformers_memory_efficient_attention() explicitly to enable it. Also, right now there is no direct way to check whether it's enabled.
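Since there is no public query API, one fragile workaround is to inspect the private _use_memory_efficient_attention_xformers flag on the attention blocks (the same flag the sed hack earlier in this thread flips). This is an assumption about diffusers internals at the time of this thread and may break in later versions:

```python
# Hedged sketch relying on a private diffusers attribute
# (`_use_memory_efficient_attention_xformers`); this is NOT a public API
# and may change or disappear between diffusers versions.
def xformers_enabled(pipe):
    """Best-effort check: True only if every attention block that carries
    the flag has it set, and at least one such block exists."""
    flags = [
        m._use_memory_efficient_attention_xformers
        for m in pipe.unet.modules()
        if hasattr(m, "_use_memory_efficient_attention_xformers")
    ]
    return bool(flags) and all(flags)
```

Treat the result as a hint rather than a guarantee; a version bump of diffusers can silently make this check always return False.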
