ComfyUI only using 50% of my VRAM? #1043
-
Can someone explain this? With AUTOMATIC1111 it uses about 9.5GB of VRAM.
Replies: 5 comments 7 replies
-
Comfy follows a policy of moving data that the GPU is not actively using to RAM, to maximize free VRAM. If your VRAM is sufficiently large, you can enable the --highvram option to keep everything in VRAM instead. Be aware, though, that as a workflow grows longer and involves more models, keeping everything resident raises the chance of a sudden VRAM OOM.
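For reference, a sketch of how the memory behavior is selected at launch (the flags are ComfyUI's; the paths assume you run from the repo root):

```shell
# Default: ComfyUI offloads model weights that are not actively
# in use from VRAM to system RAM.
python main.py

# --highvram: keep models resident in VRAM once loaded.
# Faster model switching, but long multi-model workflows can OOM.
python main.py --highvram
```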
-
I've got 16GB of VRAM (on a 2080), and normal mode uses half of it, but only while working on an image. I tried --gpu-only (OOM) and --highvram. The latter works, but is actually over 20x slower (490 sec vs 20 sec)! In normal mode my system RAM usage is high and my SSD is busy swapping. I would have expected that to be slower, but it's not; very strange.
-
I'm seeing similar results. I have 24GB of VRAM (RTX 3090) and 32GB of RAM, and --highvram made no difference in VRAM usage. I'm generating a batch of one at 4096x2048. Sometimes I get a result, sometimes just polka dots (maybe that's what I get for trying an image this large on non-SDXL checkpoints?).

This render takes 4-5 minutes with SD 1.5 or SDXL 1.0, but SD 1.4 took twice as long. I'm using the v1.5 pruned EMA-only checkpoint, Euler with 60 steps. VRAM usage never went above 8.4GB. It's using under a GB of RAM (total system usage is 21GB out of 32GB), and I see only 15-20% CPU usage and 1-9% GPU usage.

I get "Warning: Ran out of memory when regular VAE decoding, retrying with tiled VAE decoding." for all renders. But, of course, I'm nowhere near out of memory or video memory. If I lower the image to 2048x1024, it takes only 39 seconds to render and I get no out-of-memory error from the VAE decoder. With SDXL base 1.0 it never used more than 8GB of memory.

Note that I'm new to SD, so I don't know which settings would actually impact VRAM usage. In case it matters, this is the prompt I tested with:

Note that ComfyUI is detecting my correct configuration:
-
Just to add: using SD 1.5 fixed it for me. SDXL threw errors, and 2.1 threw errors.
-
Error when executing VRAM_Debug: VRAM_Debug.VRAMdebug() got an unexpected keyword argument "image_passthrough" — File "I:\comfyui\execution.py", line 151, in recursive_execute
Yeah, OOM. Nothing breaks the card physically.
Just add --highvram to your .bat file.
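A minimal sketch of that .bat change, assuming the Windows standalone build where run_nvidia_gpu.bat launches the embedded Python (your install's paths may differ):

```shell
@echo off
rem Same launcher as the stock run_nvidia_gpu.bat, with --highvram appended
rem to keep models resident in VRAM instead of offloading them to RAM.
.\python_embeded\python.exe -s ComfyUI\main.py --windows-standalone-build --highvram
pause
```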