
Required GPU memory? #25

Open
rgga-16 opened this issue May 8, 2023 · 3 comments

Comments

@rgga-16

rgga-16 commented May 8, 2023

What is the minimum amount of GPU memory needed to run this model on inference?

@IeatToilets

It depends on your needs and the amount of load, but I'd recommend at least 4GB of GPU memory, which is the standard these days. Add more if you're planning for very heavy usage.

@zky001

zky001 commented May 8, 2023

Yesterday I tried setting batch_size to 12; it used 11.5GB of GPU memory.

@deepak-1530

deepak-1530 commented May 9, 2023

I am using Shap-e on a V100 machine. For a batch size of 4 (image to 3D), it is taking around 13GB GPU memory.
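Since the memory figures above vary with batch size (11.5GB at batch size 12 on one commenter's GPU, ~13GB at batch size 4 for image-to-3D on a V100), it can help to check your GPU's actual headroom before choosing a batch size. A minimal sketch, assuming the NVIDIA driver's `nvidia-smi` tool is on the PATH; the parsing helper is hypothetical glue code, not part of Shap-E:

```python
import shutil
import subprocess

def parse_memory_csv(csv_text):
    """Parse the output of
    `nvidia-smi --query-gpu=memory.total,memory.used --format=csv,noheader,nounits`
    into a list of (total_mib, used_mib) tuples, one per GPU."""
    rows = []
    for line in csv_text.strip().splitlines():
        total, used = (int(x.strip()) for x in line.split(","))
        rows.append((total, used))
    return rows

def gpu_memory_mib():
    """Return (total, used) MiB per GPU, or [] if nvidia-smi is unavailable."""
    if shutil.which("nvidia-smi") is None:
        return []
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.total,memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_memory_csv(out)

if __name__ == "__main__":
    for i, (total, used) in enumerate(gpu_memory_mib()):
        print(f"GPU {i}: {used} / {total} MiB used")
```

If the free memory is close to the numbers reported in this thread, lowering the batch size is the usual first lever, since peak memory scales roughly with batch size.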
