How much VRAM do I need to run this on Gradio? #13
Comments
I, too, am curious about PC requirements.
A 4070 can run it.
Cannot run on an RTX 4090 with 24 GB. I keep getting a CUDA out-of-memory error.
I'm OOMing as well; is this model just really beefy? Using the defaults, running on Linux with Triton etc. installed.
OK, we just found a way to run it on an RTX 4090: you need to drop the "number of sample steps" to 35, and the image dimensions should be dropped too; 1040x640 or 640x640 works. And you can only have up to 5 or 6 steps in the description as well; anything longer blows up the VRAM. That said, right now only "RealVision" is working.
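(As a rough illustration, those reduced settings correspond to something like the following direct diffusers call. The parameter names follow the diffusers API rather than the app's Gradio field labels, and `pipe` and the prompt are placeholders, so treat this purely as a sketch.)

```python
# Sketch of the reduced settings described above, expressed as a direct
# diffusers call; `pipe` is assumed to be the app's SDXL pipeline object.
images = pipe(
    prompt="a placeholder prompt",
    num_inference_steps=35,  # dropped from the default
    height=640,
    width=640,  # 640x640; 1040x640 also reportedly fits in 24 GB
).images
```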
I have been successful with an RTX 4090 by enabling VAE slicing and CPU offloading. Add them at line 545 of gradio_app_sdxl_specific_id.py. You may need to use an old version of Pillow, as the one installed by pip may be too recent.
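The code and the exact Pillow pin from that comment were not preserved here, but a minimal sketch of enabling VAE slicing and CPU offloading with the diffusers API looks like this; the variable name `pipe` and the model id are placeholders, not details confirmed in the thread:

```python
import torch
from diffusers import StableDiffusionXLPipeline

# Build an SDXL pipeline, then enable the two memory-saving features
# named in the comment above. In the actual app, these enable_* calls
# would go where the pipeline is constructed.
pipe = StableDiffusionXLPipeline.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",  # placeholder model id
    torch_dtype=torch.float16,
)

pipe.enable_vae_slicing()        # decode latents one image at a time
pipe.enable_model_cpu_offload()  # keep idle submodules in CPU RAM and move
                                 # each to the GPU only while it runs
```

Note that `enable_model_cpu_offload()` manages device placement itself, so the usual `pipe.to("cuda")` call should be omitted. For the Pillow downgrade, something like `pip install "pillow<10"` is a plausible pin, but the specific version the commenter used was not preserved.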
Thank you, that's very helpful!
We have now added a low GPU memory version. It was tested on a machine with 24 GB of GPU memory (a Tesla A10) and 30 GB of RAM, and is expected to work well with more than 20 GB of GPU memory: `python gradio_app_sdxl_specific_id_low_vram.py`
Any chance we can run this on a 3060 with 12 GB of VRAM?
Any way of running this on an Nvidia GeForce RTX 2070 with 8 GB of VRAM?
I think I should leave this chat, having an RTX 4050 6 GB laptop GPU 😅
Can an RTX 3090/4090 handle this? And also, do you plan to release the weights on GitHub? Thanks in advance!