What is the minimum memory requirement? #43
Comments
Hello, >18 GB VRAM is required for single-image inference.
Thank you for your response, and I appreciate your efforts in creating this.
Unfortunately, as an end user with no developer background, I have no idea what needs to be done.
@yisol Hello, I'm getting this error when I try to run it on a G4dn.2xlarge AWS instance (NVIDIA T4 with 16 GB of GPU memory).
I've tried max_split_size_mb values ranging from 64 to 512 and I still get the same result.
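For anyone else trying this: max_split_size_mb is an option of PyTorch's caching CUDA allocator, passed through the PYTORCH_CUDA_ALLOC_CONF environment variable. It has to be set before torch initializes CUDA, e.g. before importing torch in the launch script. A minimal sketch (the value 128 is just an example):

```python
import os

# PyTorch's caching allocator reads its options from PYTORCH_CUDA_ALLOC_CONF.
# max_split_size_mb caps the size of blocks the allocator will split, which
# reduces fragmentation-related OOMs but does NOT lower the total memory the
# model actually needs - which is why it cannot fix a true capacity shortfall.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

# import torch  # must come after the variable is set
```

Equivalently, export the variable in the shell before launching the app. If the model's weights plus activations simply exceed 16 GB, no allocator setting will help.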
It works with a minimum of 8 GB VRAM using 4-bit precision with CPU offloading (#47).
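The gap between ">18 GB" and "8 GB with 4-bit" comes mostly from the weights themselves: halving the bits per parameter roughly halves the weight footprint, and offloading keeps only the active sub-model on the GPU. A quick back-of-the-envelope helper (the 2.5B parameter count below is illustrative, not the actual IDM-VTON size):

```python
def weight_memory_gb(num_params_billions: float, bits_per_param: int) -> float:
    """Rough VRAM footprint of model weights alone (ignores activations,
    KV/attention buffers, and framework overhead)."""
    bytes_total = num_params_billions * 1e9 * bits_per_param / 8
    return bytes_total / 1024**3

# Illustrative 2.5B-parameter UNet:
# fp16 (16-bit) weights vs 4-bit quantized weights.
fp16_gb = weight_memory_gb(2.5, 16)   # ~4.7 GB
int4_gb = weight_memory_gb(2.5, 4)    # ~1.2 GB
```

Activations and other buffers add on top of this, which is why real peak usage is several times the weight footprint.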
@FurkanGozukara but that is using your files and not the original repo.
True, the original repo uses a lot.
Just don't spam all over GitHub with your posts. I have already seen your post about buying from Patreon and I'm not interested.
Hey! What do you mean by 'your files' and 'original repo'? I have a 3060 Ti with 8 GB of memory and want to try this. Thanks!
Here is our IDM-VTON app: https://youtu.be/m4pcIeAVQD0 and here is how to use it on the cloud: https://youtu.be/LeHfgq_lAXU
I have a 4060 with 8 GB VRAM + 8 GB shared RAM, and it fails every time with a CUDA out-of-memory error.
Is there any setting I can modify so it can run with lower requirements? I already tried a 360 × 480 image size and hit the same memory issue.
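The memory-saving techniques mentioned in this thread (reduced precision, CPU offloading) are exposed by the diffusers library, which IDM-VTON builds on. A hedged sketch, assuming the pipeline follows the standard `DiffusionPipeline` API; the checkpoint id is illustrative and this will still download the full model weights:

```python
import torch
from diffusers import DiffusionPipeline

# Load in half precision; fp32 weights roughly double VRAM use.
pipe = DiffusionPipeline.from_pretrained(
    "yisol/IDM-VTON",        # illustrative checkpoint id
    torch_dtype=torch.float16,
)

# Keep each sub-model (text encoder, UNet, VAE) on the GPU only while
# it is running; peak VRAM drops to roughly the largest sub-model.
pipe.enable_model_cpu_offload()

# For very tight budgets, offload layer by layer instead (much slower):
# pipe.enable_sequential_cpu_offload()

# Compute attention in slices rather than one large batch.
pipe.enable_attention_slicing()
```

Whether these calls work unmodified depends on how the repo wires up its custom pipeline, so treat this as a starting point, not a drop-in fix.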