
Change default set_to_none=true in zero_grad methods #4438

Merged
merged 1 commit into microsoft:master from Jackmin801:match-pt-zero-grad on Oct 3, 2023

Conversation

Jackmin801 (Contributor)

The default was changed to False in this PR with the intention of matching PyTorch:
#2741

However, PyTorch has since changed its default to True:
https://pytorch.org/docs/stable/_modules/torch/optim/optimizer.html#Optimizer.zero_grad
pytorch/pytorch#92731
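
For readers unfamiliar with the flag, here is a minimal sketch of the two behaviors in plain PyTorch (illustrative only; DeepSpeed's optimizer wrappers define their own `zero_grad` methods, but the `set_to_none` semantics are the same):

```python
import torch

model = torch.nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)

model(torch.randn(8, 4)).sum().backward()

# set_to_none=False: gradients are zeroed in place, so the .grad
# tensors stay allocated between steps.
opt.zero_grad(set_to_none=False)
print(model.weight.grad)  # tensor of zeros

model(torch.randn(8, 4)).sum().backward()

# set_to_none=True (PyTorch's current default): .grad is set to None,
# which releases the memory; the next backward() allocates fresh tensors.
opt.zero_grad(set_to_none=True)
print(model.weight.grad)  # None
```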

Jackmin801 (Contributor, Author)

Should I go around the code changing all the `self.zero_grad(set_to_none=True)` calls to `self.zero_grad()`? Personally I think it's ok to just leave them be.

tjruwase (Contributor) commented Oct 3, 2023

> Should I go around the code changing all the `self.zero_grad(set_to_none=True)` calls to `self.zero_grad()`? Personally I think it's ok to just leave them be.

@Jackmin801, thanks for the PR. Yes, it is okay to leave the existing calls as is.
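
As a small illustration of why the existing call sites are fine (assuming an optimizer `opt` using the updated default), the explicit argument becomes redundant but harmless:

```python
# Equivalent once set_to_none defaults to True, so existing explicit
# call sites can safely be left unchanged:
opt.zero_grad()                  # set_to_none now defaults to True
opt.zero_grad(set_to_none=True)  # explicit, redundant but harmless
```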

tjruwase added this pull request to the merge queue on Oct 3, 2023
Merged via the queue into microsoft:master with commit 2f73b83 on Oct 3, 2023
16 checks passed
Jackmin801 deleted the match-pt-zero-grad branch on October 3, 2023
mauryaavinash95 pushed a commit to mauryaavinash95/DeepSpeed that referenced this pull request on Oct 9, 2023