[Questions] Migrate from Main automatic to Vladmandic #223
-
Hey guys, I migrated here, but as I'm a noob I have some questions:
1. I don't need Visual Studio to run CUDA properly, right?
2. Is it normal that 3 GB of VRAM is in use without doing anything?
3. My Torch version is 2.0.0+cu118 (autocast half). Does this correctly support the CUDA 12.1 that I have installed? (Does PyTorch have to be linked to the exact version of CUDA?)
4. Do I still need to install xFormers?
Thanks 🙏
Replies: 1 comment
-
re: 1 - no
re: 2 - it's normal, since the model is loaded into VRAM. You can use `--lowvram` to load the model into RAM and swap it to VRAM only when needed, but that slows generation down by quite a bit.
re: 3 - your CUDA may be 12.1, but Torch is compiled against 11.8, so that's what it reports. Normally, mixing versions like that is a no-no, but this is a known good combo.
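On point 3, the CUDA toolkit a PyTorch wheel was built against is encoded in the suffix of its version string (inside a running install you would just read `torch.version.cuda`). A minimal sketch of extracting that tag, with a hypothetical helper name:

```python
def cuda_build_tag(torch_version: str) -> str:
    """Return the CUDA build tag (e.g. 'cu118') from a PyTorch
    version string like '2.0.0+cu118'; wheels without a '+cuXXX'
    suffix are CPU-only or use a default build."""
    _, sep, tag = torch_version.partition("+")
    return tag if sep else "unknown"

print(cuda_build_tag("2.0.0+cu118"))  # -> cu118
```

So a `2.0.0+cu118` wheel ships its own CUDA 11.8 runtime libraries, which is why it works even with a newer system-wide CUDA 12.1 install.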
re: 4 - your choice; xFormers is fully optional here, since the default cross-attention optimization is sdp. It's almost identical to xFormers in performance, but it's built in, so there are fewer installation complications.
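For context, "sdp" is scaled dot-product attention, softmax(QK^T / sqrt(d)) V, which PyTorch 2.x provides natively as `torch.nn.functional.scaled_dot_product_attention`. A pure-Python toy sketch of the computation (tiny matrices, illustration only, not how the webui actually calls it):

```python
import math

def softmax(xs):
    # Subtract the max for numerical stability before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def sdp_attention(Q, K, V):
    """softmax(Q @ K^T / sqrt(d)) @ V for plain nested lists."""
    d = len(Q[0])
    out = []
    for q in Q:
        # Scaled dot-product scores of this query against every key.
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in K]
        w = softmax(scores)
        # Weighted sum of the value rows.
        out.append([sum(wi * v[j] for wi, v in zip(w, V))
                    for j in range(len(V[0]))])
    return out

Q = [[1.0, 0.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
print(sdp_attention(Q, K, V))
```

xFormers computes the same attention, just with its own memory-efficient kernels, which is why the two are near-identical in output and speed.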