
Bad mIoU when using many GPUs #21

Closed
xpngzhng opened this issue Mar 29, 2020 · 2 comments


xpngzhng commented Mar 29, 2020

I train with the default deeplabv3plus config, changing only the number of GPUs. The validation mIoU drops noticeably at 4 GPUs and sharply at 8:

1 gpu: 0.7729
2 gpus: 0.7750
4 gpus: 0.7478
8 gpus: 0.5373

I guess this is caused by batch normalization; maybe sync BN would make a difference.
Things are quite different in object detection, e.g. mmdetection, where plain BN is used and the performance does not vary much when I change the number of GPUs.

@DarthThomas
Contributor

Hi,

Yes, it is a problem with BN: with plain BN each GPU computes normalization statistics only over its own small local batch, so the statistics get noisier as the work is split across more GPUs. Sync BN would reduce the difference between different numbers of GPUs.
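
For reference, here is a minimal sketch of what switching to sync BN looks like at the plain PyTorch level. This is generic PyTorch code, not this repo's training script; the helper name and the use of DistributedDataParallel are assumptions on my part.

```python
# Minimal sketch, not this repo's actual training code.
# Converts every BatchNorm layer in a model to SyncBatchNorm so that
# normalization statistics are computed over the global batch across
# all GPUs instead of each GPU's small local batch.
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_with_sync_bn(model: nn.Module, local_rank: int) -> nn.Module:
    # convert_sync_batchnorm is a standard PyTorch API; where and how it is
    # called in this repo is an assumption.
    model = nn.SyncBatchNorm.convert_sync_batchnorm(model)
    model = model.to(local_rank)
    # SyncBatchNorm only works under distributed training (one process per GPU).
    return DDP(model, device_ids=[local_rank])
```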

@xpngzhng
Author

Yes, sync BN dramatically improves the mIoU metric.
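
For anyone landing here later: if the codebase follows an mm-style config system (an assumption on my part, not confirmed in this thread), sync BN is usually enabled through the normalization setting in the config rather than by editing model code, roughly like this:

```python
# Hypothetical mm-style config fragment; the exact keys in this repo may differ.
norm_cfg = dict(type='SyncBN', requires_grad=True)  # instead of type='BN'

model = dict(
    backbone=dict(norm_cfg=norm_cfg),
    decode_head=dict(norm_cfg=norm_cfg),
)
```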
