
--medvram OOMs #42

Closed
oobabooga opened this issue Sep 1, 2022 · 2 comments

Comments

oobabooga (Contributor)

oobabooga commented Sep 1, 2022

My video card is GTX 1650 4GB and I am running the script with

python stable-diffusion-webui/webui.py --medvram --precision full --no-half

I have noticed that the new --medvram option is not equivalent to the old --lowvram option. In this new implementation, I am unable to generate txt2img with 512x512 or 448x448 resolutions like I could before.

If you need me to run any specific tests, please let me know.

AUTOMATIC1111 (Owner)

The old --lowvram should be equivalent to --lowvram --always-batch-cond-uncond. Please try that. If --medvram OOMs for you, there's nothing I can do.
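Combining that suggestion with the original invocation from the issue description, the adjusted command would look something like the sketch below (the --precision full and --no-half flags are carried over from the reporter's original command; paths are as given in the issue):

```shell
# Replace --medvram with --lowvram plus --always-batch-cond-uncond,
# which together reproduce the old --lowvram memory behavior.
python stable-diffusion-webui/webui.py --lowvram --always-batch-cond-uncond --precision full --no-half
```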

oobabooga (Contributor, Author)

I confirm that --lowvram --always-batch-cond-uncond behaves as before, with the same time per image (I was not aware of this new option), and both 448x448 and 512x512 work. Issue resolved, thank you!

Sashimimochi pushed a commit to Sashimimochi/stable-diffusion-webui that referenced this issue Apr 7, 2023
* Update hlky

* Update automatic