Register all the parameters in the optimizer (pytorch#1455)
The code at the end registers only the parameters from `model.fc` in the optimizer, although the text underneath says: "Notice although we register all the parameters in the optimizer, the only parameters that are computing gradients (and hence updated in gradient descent) are the weights and bias of the classifier."

To be consistent with this explanation, we should be adding all the parameters from the model.
ageron committed Apr 5, 2021
1 parent 6fb4757 commit 9a43c95
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion beginner_source/blitz/autograd_tutorial.py

@@ -299,7 +299,7 @@
 # The only parameters that compute gradients are the weights and bias of ``model.fc``.
 
 # Optimize only the classifier
-optimizer = optim.SGD(model.fc.parameters(), lr=1e-2, momentum=0.9)
+optimizer = optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)
 
 ##########################################################################
 # Notice although we register all the parameters in the optimizer,
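The pattern the commit restores can be sketched end to end: freeze every parameter, replace the classifier head, register *all* parameters in the optimizer, and observe that only the head is updated. This is a minimal sketch; `TinyNet` is a hypothetical stand-in for the tutorial's ``resnet18`` so the example stays self-contained.

```python
import torch
from torch import nn, optim

# Hypothetical stand-in for the tutorial's resnet18: a backbone plus a classifier head.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Linear(4, 8)
        self.fc = nn.Linear(8, 2)

    def forward(self, x):
        return self.fc(self.backbone(x))

model = TinyNet()

# Freeze everything, then swap in a fresh classifier head
# (newly constructed parameters require grad by default).
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(8, 2)

# Register all the parameters, as the commit does; frozen parameters
# never receive a gradient, so the optimizer simply skips them.
optimizer = optim.SGD(model.parameters(), lr=1e-2, momentum=0.9)

before = model.backbone.weight.clone()
loss = model(torch.randn(3, 4)).sum()
loss.backward()
optimizer.step()
```

After the step, ``model.backbone.weight`` is unchanged and its ``.grad`` is still ``None``, while ``model.fc`` has moved: registering every parameter is harmless because only parameters with ``requires_grad=True`` accumulate gradients and get updated.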
