Replies: 2 comments 1 reply
-
Try memory-efficient attention: xformers.
1 reply
-
There was a small memory leak that should be fixed now.
0 replies
-
Something I've noticed going back and forth between commit c589a35 and the newest repo: it seems I need more VRAM now.
With the newest build, I have to disable EMA and latent caching just to barely squeeze by and get training to run, whereas before this was never an issue; I always had both enabled.
What could be the cause? Could it be a newer diffusers version, or something with the memory attention setting (since that option is not present in the older build)? I have no idea; I'm just curious and trying to understand what's going on.
I am on a 3060 12GB
Windows 10
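One way to turn "it seems I need more VRAM" into hard numbers is to compare peak allocation between the two builds. A small sketch using PyTorch's built-in CUDA memory stats; the helper name is mine, not part of this repo:

```python
import torch


def report_peak_vram(tag: str) -> float:
    """Print and return peak allocated VRAM in GiB since the last reset."""
    if not torch.cuda.is_available():
        # Keeps the helper usable on CPU-only machines for testing.
        print(f"{tag}: CUDA not available")
        return 0.0
    peak = torch.cuda.max_memory_allocated() / 1024**3
    print(f"{tag}: peak VRAM {peak:.2f} GiB")
    # Reset so the next call measures the next phase in isolation.
    torch.cuda.reset_peak_memory_stats()
    return peak
```

Calling this after a training step on each build would show whether the new build's peak really is higher, and by how much.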