use APEX DistributedDataParallel if use_amp is None #881

Closed
ShoufaChen opened this issue Sep 26, 2021 · 3 comments · Fixed by #882
Comments

@ShoufaChen
Contributor

https://github.com/rwightman/pytorch-image-models/blob/3d9c23af879283e80c2c208786d5613351ca040b/train.py#L454

Hi, I found that if we don't activate AMP but Apex is installed in our environment, ApexDDP will be chosen by default.

I was wondering whether this is the best choice, or did I miss something?

Thanks in advance.
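For context, the wrapper selection around the linked line looks roughly like the sketch below (a paraphrase, not the exact source; `use_amp`, `has_apex`, `args`, and `model` come from the surrounding script). Because the condition only checks that Apex is installed and that native AMP isn't active, ApexDDP is picked even when AMP is disabled entirely:

```python
# Rough sketch of the DDP-wrapper selection near train.py#L454 (paraphrased).
from torch.nn.parallel import DistributedDataParallel as NativeDDP
try:
    from apex.parallel import DistributedDataParallel as ApexDDP
    has_apex = True
except ImportError:
    has_apex = False

if args.distributed:
    if has_apex and use_amp != 'native':
        # Apex DDP is chosen whenever apex is importable and native AMP is not
        # active -- including the use_amp is None case reported here.
        model = ApexDDP(model, delay_allreduce=True)
    else:
        model = NativeDDP(model, device_ids=[args.local_rank])
```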

@rwightman
Collaborator

@ShoufaChen you are right, that's not the best choice these days. I almost always use AMP (which defaults to native), so this hasn't been an issue for me. I'll make a note to fix it.

ShoufaChen mentioned this issue Sep 26, 2021
@ShoufaChen
Contributor Author

Hi, @rwightman

Thanks for your reply. I've opened pull request #882 for this issue.
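For reference, one way to address this (a sketch of the intent; see #882 for the actual diff) is to fall back to Apex DDP only when Apex AMP was explicitly selected, and use the native PyTorch DistributedDataParallel otherwise:

```python
# Possible fix (sketch only; the authoritative change is in PR #882):
# use Apex DDP only when Apex AMP was explicitly requested.
if args.distributed:
    if has_apex and use_amp == 'apex':
        model = ApexDDP(model, delay_allreduce=True)
    else:
        model = NativeDDP(model, device_ids=[args.local_rank])
```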

@rwightman
Collaborator

@ShoufaChen thanks, merged

shachargluska pushed a commit to shachargluska/pytorch-image-models that referenced this issue Sep 30, 2021