clarify default value of requires_grad for tensors #61038
Conversation
💊 CI failures summary and remediations
As of commit d8c670e (more details on the Dr. CI page and at hud.pytorch.org/pr/61038):
💚 Looks good so far! There are no failures yet. 💚
Preview docs built from this PR.
(This comment was automatically generated by Dr. CI.)
@dagitses has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
docs/source/notes/autograd.rst (Outdated)
@@ -91,6 +91,8 @@ the module level with :meth:`nn.Module.requires_grad_()`.
 When applied to a module, ``.requires_grad_()`` takes effect on all
 of the module's parameters (which have ``requires_grad=True`` by default).
+
+Tensors that are not module parameters have ``requires_grad=False` by default.
Maybe this should be just after the paragraph at line 61 mentioning that all the Tensor factory functions set it to False by default and that it can be changed via the `requires_grad` keyword argument.
Also mention that when a Tensor is wrapped inside an `nn.Parameter`, its `requires_grad` field is automatically set to True.
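A runnable sketch of the behavior described above (variable names are illustrative; this assumes current PyTorch defaults):

```python
import torch
import torch.nn as nn

# Factory functions create tensors with requires_grad=False by default.
x = torch.ones(3)
assert x.requires_grad is False

# The default can be overridden via the requires_grad keyword argument.
y = torch.ones(3, requires_grad=True)
assert y.requires_grad is True

# Wrapping a tensor in nn.Parameter sets requires_grad=True automatically.
p = nn.Parameter(torch.ones(3))
assert p.requires_grad is True
```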
Yeah, I like that location better too. This is so far down that I worry no one would ever see it.
What do you think of this?
docs/source/notes/autograd.rst (Outdated)
@@ -91,6 +91,8 @@ the module level with :meth:`nn.Module.requires_grad_()`.
 When applied to a module, ``.requires_grad_()`` takes effect on all
 of the module's parameters (which have ``requires_grad=True`` by default).
+
+Tensors that are not module parameters have ``requires_grad=False` by default.
Note that the formatting above is missing the second backquote after the `requires_grad`.
Thanks, but now moot.
BTW, I see some lines like :meth:`nn.Module.requires_grad_()`. Do you know why those are able to get away with single backquotes?
The ones with a marker before, like `:meth:` or `:attr:`, need only single backquotes. Inline code is marked by double backquotes.
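For illustration, a minimal reST sketch of the two styles (standard Sphinx cross-reference roles, as used in the PyTorch docs):

```rst
Cross-reference roles take a single pair of backquotes:
:meth:`nn.Module.requires_grad_` and :attr:`requires_grad`.

Plain inline code takes a double pair:
``requires_grad=False`` and ``.requires_grad_()``.
```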
Differential Revision: [D29491984](https://our.internmc.facebook.com/intern/diff/D29491984) [ghstack-poisoned]
LGTM
Stack from ghstack:
Differential Revision: D29491984