
Question about batch_size, epoch setting #40

Open
guvcolie opened this issue Dec 24, 2017 · 3 comments

@guvcolie

Hi, I'm trying to train DenseNet-121 on the ImageNet dataset, but my results are poor...
Now I wonder how batch_size is counted across multiple GPUs. You said "It took us 10 days to train 40M densenet for 120 epochs on 4 TITAN X GPUs, with batchsize 128" in issue https://github.com/liuzhuang13/DenseNet/issues/5. Do you mean each GPU uses a batch size of 128, or each GPU uses 32, summing to 128?
Thank you!

@liuzhuang13
Owner

Thanks. We meant the latter: each GPU uses 32, summing to 128.

@Jianf-Wang

So... if I use 2 GPUs and set batch_size = 64, the effective batch size is actually 128?

@liuzhuang13
Owner

Actually, "batch_size" in the code means the total batch size. So if you want a total batch size of 128, just set batch_size = 128; it is then divided evenly across the GPUs.
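To illustrate the convention discussed in this thread, here is a minimal sketch of how a *total* batch size is typically divided across GPUs in data-parallel training. The function name and structure are illustrative only, not taken from this repository's code:

```python
def per_gpu_batch(total_batch_size: int, num_gpus: int) -> int:
    """Return the number of samples each GPU processes per step,
    assuming the total batch is split evenly across GPUs."""
    assert total_batch_size % num_gpus == 0, \
        "total batch size must divide evenly across GPUs"
    return total_batch_size // num_gpus

# The setting from this thread: batch_size = 128 on 4 TITAN X GPUs
print(per_gpu_batch(128, 4))  # -> 32

# The follow-up question: batch_size = 64 on 2 GPUs is a total of 64,
# with each GPU seeing 32 samples per step (not a total of 128).
print(per_gpu_batch(64, 2))  # -> 32
```

Under this convention, the gradient is computed over the full 128-sample batch regardless of the GPU count, so learning-rate settings tied to the batch size stay valid when you change the number of GPUs.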
