-
Hi, I have been unlucky so far. I got the newest pull, tried a lot of different settings, and got a CUDA OOM every time. Before I spend a night training on CPU only: has anyone been successful on an 8GB GPU, and if so, how?
Replies: 4 comments 4 replies
-
I think the minimum-VRAM setup would be running with xformers, 8-bit Adam, gradient checkpointing, and latent caching enabled. Still, the lowest I've ever heard of is about 9 GB.
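The settings above would typically be passed as command-line flags to a kohya-ss-style training script. A minimal sketch of assembling them (the flag names are assumptions based on common kohya-ss conventions; check your script's `--help` before use):

```python
# Hypothetical helper that assembles low-VRAM flags for a kohya-ss-style
# training script. Every flag name here is an assumption -- verify against
# your script version's --help output.
def low_vram_flags(cache_latents=True):
    flags = [
        "--xformers",                     # memory-efficient attention
        "--optimizer_type", "AdamW8bit",  # 8-bit Adam (bitsandbytes)
        "--gradient_checkpointing",       # trade compute for VRAM
    ]
    if cache_latents:
        flags.append("--cache_latents")   # skip re-encoding images each step
    return " ".join(flags)

print(low_vram_flags())
```

Gradient checkpointing and latent caching both cut activation memory, which is why they show up in every sub-10GB report here.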
-
If you're on Linux you can try DeepSpeed to get below 8 GB. On Windows, the lowest I've seen reported is around 9.x GB.
-
With LoRA I have done it on 6 GB. I had to disable the full checkpoint saves (by editing the code), disable saving of sample images (also a code edit), enable 8-bit Adam, enable LoRA, use an efficient attention implementation (not xformers, the other option), and disable text encoder training.
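For reference, a LoRA-focused invocation along those lines might look like the fragment below. This is only a sketch: every flag name is an assumption based on kohya-ss-style scripts (the checkpoint/image-save changes above were code edits, not flags), so verify each one against your script's `--help`.

```shell
# Hypothetical low-VRAM LoRA run; all flag names are assumptions.
accelerate launch train_network.py \
  --network_module networks.lora \
  --network_train_unet_only \
  --sdpa \
  --optimizer_type AdamW8bit \
  --gradient_checkpointing \
  --cache_latents
```

Here `--network_train_unet_only` would skip text encoder training, and `--sdpa` would select PyTorch's built-in memory-efficient attention instead of xformers, matching the setup described above.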