Hi there, thanks for sharing this code. I am currently trying to run it on an NVIDIA GeForce RTX 3060 with 12 GB of VRAM. When I run "run_features_extraction.py", the call "z_enc, _ = sampler.encode_ddim(...)" finishes, but the subsequent call "samples_ddim, _ = sampler.sample()" fails with "RuntimeError: CUDA out of memory". Is something going wrong on my end, or does the model really need more than 12 GB of memory?
I would appreciate any help.
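In case it helps narrow this down, here is roughly how I am logging GPU memory around the two calls. This is only a sketch: the reporting helper and the no_grad/empty_cache wrappers are my own additions (they may be redundant if the script already applies them), and the sampler argument lists are elided as above.

```python
import torch

def report_cuda_memory(tag: str) -> None:
    # Standard PyTorch calls: tensor memory actually allocated vs. the
    # caching allocator's total reservation on the current device.
    allocated = torch.cuda.memory_allocated() / 2**30
    reserved = torch.cuda.memory_reserved() / 2**30
    print(f"[{tag}] allocated={allocated:.2f} GiB, reserved={reserved:.2f} GiB")

# Around the two calls from run_features_extraction.py (my additions are
# the logging, no_grad, and empty_cache; argument lists elided as above):
with torch.no_grad():
    report_cuda_memory("before encode_ddim")
    z_enc, _ = sampler.encode_ddim(...)
    report_cuda_memory("after encode_ddim")

    torch.cuda.empty_cache()               # release cached blocks before sampling
    report_cuda_memory("before sample")
    samples_ddim, _ = sampler.sample(...)  # "RuntimeError: CUDA out of memory" here
```

The numbers printed right before the failing call might show whether the inversion result is already close to the 12 GB limit or whether the sampling step itself is the peak.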