
How much VRAM do I need to run this on Gradio? #13

Open
davizca opened this issue May 4, 2024 · 11 comments

Comments

@davizca

davizca commented May 4, 2024

RTX 3090/4090 can handle this?

Also, do you plan to release the weights on GitHub?

Thanks in advance!

@ScotterMonk

I, too, am curious about PC requirements.
How will the following do?
AMD 5800X 64GB, RTX 3080 12GB

@smthemex

smthemex commented May 5, 2024

A 4070 can run it.

@Speedway1

Cannot run on an RTX 4090 with 24GB. I keep getting a CUDA out of memory error.

@cryptowooser
Contributor

I'm OOMing as well; is this model just really beefy? I'm using the defaults, running on Linux with Triton etc. installed.

@Speedway1

OK, we just found a way to run it on an RTX 4090: you need to drop the "number of sample steps" to 35, and the image dimensions should be dropped too; 1040x640 or 640x640 works. You can also only have up to 5 or 6 steps in the description; anything longer blows the VRAM. Having said that, right now only "RealVision" is working.

@deepbeepmeep

I have been successful with an RTX 4090 by enabling VAE slicing and CPU offloading.

Add at line 545 of gradio_app_sdxl_specific_id.py:

    pipe.enable_vae_slicing()
    pipe.enable_model_cpu_offload()

You may need to use an older version of Pillow, as the one installed by pip may be too recent:

    pip install pillow==9.5

@Speedway1

"I have been successful with a RTX 4090 by enabling VAE slicing and CPU offloading. [...]"

Thank you, that's very helpful!

@Z-YuPeng
Collaborator

Z-YuPeng commented May 6, 2024

We have now added a low GPU memory cost version. It was tested on a machine with 24 GB of GPU memory (a Tesla A10) and 30 GB of RAM, and it is expected to work well with more than 20 GB of GPU memory.

python gradio_app_sdxl_specific_id_low_vram.py
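Before launching either script, a quick way to check how much VRAM torch can see. The 20 GB threshold is taken from the comment above; the rest is a generic sketch, not part of this repo.

```python
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    total_gib = props.total_memory / 1024**3
    print(f"{torch.cuda.get_device_name(0)}: {total_gib:.1f} GiB VRAM")
    if total_gib < 20:
        print("Under 20 GiB; try the low-VRAM script and reduced settings.")
else:
    print("No CUDA GPU detected.")
```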

@maxbizz

maxbizz commented May 9, 2024

Any chance we can run this on a 3060 with 12 GB of VRAM?

@jjhaggar

jjhaggar commented May 9, 2024

Any way of running this on an Nvidia GeForce RTX 2070 with 8 GB of VRAM?
Maybe by lowering the resolution and number of steps? Please let us know if somebody gets it to work with similar hardware specs :)

@AayushSameerShah

I think I should leave this chat having RTX 4050 6GB Laptop GPU 😅


10 participants