How much GPU memory is needed? #2
Comments
I used ~20 GB.
Can you share the complete output from the terminal?
This is the output when I tried it on a 3090 (24 GB):
Hi, thank you for your excellent work. I'm facing a similar issue: running out of memory on an A40 GPU (46 GB) when executing pdm_pure.py at 512x512 resolution. Do you have any suggestions? Update: solved by using xformers.
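For readers hitting the same OOM: a minimal sketch of what the xformers fix amounts to, assuming the model uses standard scaled dot-product self-attention (the internals of pdm_pure.py are not shown in this thread, so this is illustrative, not the repo's actual code). In diffusers-based pipelines the switch is typically a single call, `pipe.enable_xformers_memory_efficient_attention()`. PyTorch 2.x also ships an equivalent fused kernel, `torch.nn.functional.scaled_dot_product_attention`, which computes the same result as naive attention without materializing the full score matrix:

```python
import torch
import torch.nn.functional as F

def naive_attention(q, k, v):
    # Materializes the full (N, N) score matrix -- this is what runs out of memory.
    scores = q @ k.transpose(-2, -1) / (q.shape[-1] ** 0.5)
    return torch.softmax(scores, dim=-1) @ v

torch.manual_seed(0)
# (batch, heads, tokens, head_dim) -- small illustrative shapes
q = torch.randn(1, 8, 256, 64)
k = torch.randn(1, 8, 256, 64)
v = torch.randn(1, 8, 256, 64)

out_naive = naive_attention(q, k, v)
# Fused kernel: same math, but the (N, N) matrix is never stored in full
# (memory-efficient / flash backends are selected where available).
out_fused = F.scaled_dot_product_attention(q, k, v)
print(torch.allclose(out_naive, out_fused, atol=1e-4))
```

Since the outputs match, swapping the attention implementation changes memory use, not results.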
Still not solved. I have given up.
I think you need to use memory-efficient attention; otherwise it will be too costly.
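To see why naive attention is the likely culprit, here is a back-of-the-envelope sketch of the memory needed just to hold one attention score matrix. The head count (8) and fp16 storage are assumptions for illustration; the actual resolution at which pdm_pure.py applies self-attention is not stated in this thread:

```python
def attn_score_bytes(h, w, heads=8, bytes_per_el=2):
    """Bytes to materialize one (N, N) attention score matrix for
    N = h * w tokens, per batch element (8 heads, fp16 assumed)."""
    n = h * w
    return n * n * heads * bytes_per_el

# 64x64 feature map: manageable
print(attn_score_bytes(64, 64) / 2**30)    # 0.25 GiB
# 512x512 (full pixel resolution): hopeless without efficient attention
print(attn_score_bytes(512, 512) / 2**40)  # 1.0 TiB
```

The cost grows quadratically in the token count, which is why a 24 GB or even 46 GB card fails at 512x512 while memory-efficient attention (xformers or PyTorch SDPA), which never stores the full matrix, fits.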
I tried 32 GB and 24 GB GPUs to run your demo code, but both failed with CUDA out of memory.