GPU and batch size? #30
Comments
Hi @pigcv89
I followed the default experiment settings (bs=8 with 4 GPU cards), but it looks like each card only used about 4 GB of memory. Am I misunderstanding something, or does 'bs=8' mean 'bs=8 for each GPU'?
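If the training script wraps the model in `torch.nn.DataParallel` (an assumption — the repo's actual setup is not shown here), a global batch of 8 is scattered near-evenly across the 4 GPUs, so each card only ever holds 2 samples. That would explain the ~4 GB per-card usage. A minimal sketch of the split arithmetic (`scatter_batch` is a hypothetical helper, not part of the repo):

```python
# Sketch: how a data-parallel wrapper splits one global batch across GPUs.
# Assumption: batch size 8 is the GLOBAL batch, scattered across 4 devices,
# mirroring torch.nn.DataParallel's near-even chunking.

def scatter_batch(batch_size, num_gpus):
    """Return per-GPU chunk sizes for a near-even split of the batch."""
    base, rem = divmod(batch_size, num_gpus)
    # The first `rem` GPUs get one extra sample when the split is uneven.
    return [base + (1 if i < rem else 0) for i in range(num_gpus)]

print(scatter_batch(8, 4))  # each of the 4 GPUs sees only 2 samples
```

Under this reading, per-GPU memory scales with the 2-sample chunk, not the full batch of 8, so 12 GB cards would not be strictly necessary.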
@pigcv89
Thanks for your reply. I'll close this issue.
Thanks for your great work!
I noticed that in your paper you mentioned: "The model is trained on 4 TITAN-Xp GPUs with batch size 8 for 8 epochs."
However, I trained SEAM on 4 2080Ti GPUs with batch size 8 and found that each card only used about 4 GB of memory.
So I wonder: are 4×12 GB GPUs actually necessary?
Thanks for your reply.