[dist_optim] add distributed functional sgd optimizer #50618
Conversation
Looks good as it matches the implementation in #45597
```diff
@@ -207,3 +207,5 @@ def test_dist_optim(self):
     def test_dist_optim_functional(self):
         self._test_dist_optim_base(optim.Adagrad, lr=0.05)
         self._test_dist_optim_base(optim.Adam, lr=1e-2, amsgrad=True)
+        self._test_dist_optim_base(optim.SGD, lr=0.05)
```
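For context, a functional optimizer differs from the eager `torch.optim.SGD` in that `step()` takes the gradients as explicit arguments instead of reading them from `param.grad`, which is what lets the distributed autograd engine drive it. A minimal sketch of the idea (illustrative only, not the implementation added in this PR):

```python
from typing import List, Optional

import torch
from torch import Tensor


class FunctionalSGD:
    """Toy functional SGD: gradients are passed to step() explicitly
    rather than read from param.grad, so the optimizer can be driven
    by gradients held in a distributed autograd context."""

    def __init__(self, params: List[Tensor], lr: float = 0.05):
        self.params = params
        self.lr = lr

    @torch.no_grad()
    def step(self, gradients: List[Optional[Tensor]]) -> None:
        # The caller supplies one gradient per parameter, e.g. the
        # gradients accumulated for this worker in a dist autograd pass.
        for param, grad in zip(self.params, gradients):
            if grad is not None:
                param.add_(grad, alpha=-self.lr)


# Usage: a single descent step on a throwaway parameter.
w = torch.zeros(2, 2)
opt = FunctionalSGD([w], lr=0.05)
opt.step([torch.ones(2, 2)])  # w is now -0.05 everywhere
```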
Is this test now duplicated with the one on line 203 of this file?
Yes, you're right. Right now we default to the TorchScripted version of the optimizer, so the two tests exercise the same path; let me merge them.
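To make the "defaulting" behavior concrete, here is a hedged sketch of the dispatch logic; the names below are hypothetical (the real mapping lives inside `torch.distributed.optim` and covers more optimizers), and `FunctionalSGD` refers to the toy sketch above:

```python
import torch

# Illustrative registry: map an eager optimizer class to a
# TorchScript-friendly functional counterpart.
_functional_optim_map = {torch.optim.SGD: FunctionalSGD}


def make_optimizer(optim_cls, params, **kwargs):
    # Prefer the functional version whenever one is registered, which
    # is why a separate eager-mode SGD test would be redundant here.
    functional_cls = _functional_optim_map.get(optim_cls)
    if functional_cls is not None:
        return functional_cls(list(params), **kwargs)
    return optim_cls(params, **kwargs)
```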
Stack from ghstack:
Differential Revision: D25932778