Update on "Enable fused optimizer for DP"
Differential Revision: [D42714482](https://our.internmc.facebook.com/intern/diff/D42714482/)

**NOTE FOR REVIEWERS**: This PR has internal Meta-specific changes or comments, please review them on [Phabricator](https://our.internmc.facebook.com/intern/diff/D42714482/)!

[ghstack-poisoned]
rohan-varma committed Apr 12, 2023
2 parents 0b45637 + fa7af2d commit 6a114e6
Showing 1 changed file with 1 addition and 1 deletion.

torch/csrc/distributed/c10d/reducer.cpp
@@ -1466,7 +1466,7 @@ void Reducer::finalize_bucket_dense(Bucket& bucket) {
       if (!gradient_as_bucket_view_) {
         if (optim_in_backward_) {
           // Return early if optimizer has already run.
-          runGradCallbackForVariable(variable, [&](auto& grad) {return true;});
+          runGradCallbackForVariable(variable, [&](auto& grad) { return true; });
         } else {
           RECORD_FUNCTION(
               "torch.distributed.ddp.reducer::copy_bucket_to_grad",
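For context on the line being touched: when an optimizer has been registered to run during backward() (the "fused optimizer" of the PR title), it has already consumed the reduced gradient by the time finalize_bucket_dense runs, so the reducer returns early instead of copying the bucket back into param.grad. The sketch below shows how that path is exercised from Python. It is illustrative, not part of this commit: it assumes a PyTorch build that includes the experimental, underscore-prefixed _apply_optimizer_in_backward API, and the single-process gloo setup exists only so the snippet runs standalone.

    import os

    import torch
    import torch.distributed as dist
    import torch.nn as nn
    from torch.distributed.optim import _apply_optimizer_in_backward
    from torch.nn.parallel import DistributedDataParallel as DDP

    # Single-process setup so the sketch runs standalone; real jobs use torchrun.
    os.environ.setdefault("MASTER_ADDR", "localhost")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=0, world_size=1)

    model = nn.Linear(8, 8)

    # Register an SGD step to run per-parameter inside backward(). DDP detects
    # these registrations, and the reducer then takes the optim_in_backward_
    # early return touched by this diff instead of copying bucket gradients
    # into param.grad.
    _apply_optimizer_in_backward(
        optimizer_class=torch.optim.SGD,
        params=model.parameters(),
        optimizer_kwargs={"lr": 0.01},
    )

    ddp_model = DDP(model)
    loss = ddp_model(torch.randn(4, 8)).sum()
    loss.backward()  # optimizer steps run inside backward; no optimizer.step()

    dist.destroy_process_group()

Because the step happens inside backward, gradients do not need to persist in param.grad afterward, which is why the copy_bucket_to_grad work can be skipped on this path.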
