Add randomness case to the autograd notes #78617
Conversation
I also took this chance to clean up the sphinx formatting a bit and reword a few minor things.
Dr. CI: ✅ No failures (0 pending) as of commit 2a42dc5.
Looks mostly good! Have some small comments. Thanks for fixing up the sphinx.
Nice update!
docs/source/notes/autograd.rst (Outdated)
#. If the function is concave (at least locally), use the super-gradient with minimum norm (using a similar argument as above).
#. If the function is defined, define the gradient at the current point by continuity (note that :math:`\infty` is possible here, for example for :math:`\sqrt{0}`). If multiple values are possible, pick one arbitrarily.
#. If the function is not defined (:math:`\sqrt{-1}`, :math:`\log(-1)`, or most functions when the input is :math:`\text{NaN}`, for example) then the value used as the gradient is arbitrary (we might also raise an error, but that is not guaranteed). Most functions will use :math:`\text{NaN}` as the gradient, but for performance reasons, some functions will use non-:math:`\text{NaN}` values (:math:`\log(-1)`, for example).
#. If the function is convex (at least locally), use the sub-gradient of minimum norm (it the steepest descent direction).
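To make these conventions concrete, here is a minimal sketch (not part of this diff; it assumes the standard PyTorch autograd behavior described above) showing one representative op for each of the convex, defined-by-continuity, and not-defined cases:

```python
import torch

# Convex case: abs() is not differentiable at 0; any value in [-1, 1] is a
# valid sub-gradient there, and autograd picks the minimum-norm one, 0.
x = torch.tensor(0.0, requires_grad=True)
torch.abs(x).backward()
print(x.grad)  # tensor(0.)

# Defined-by-continuity case: the derivative of sqrt at 0 is 1/(2*sqrt(0)),
# so the gradient is inf.
y = torch.tensor(0.0, requires_grad=True)
torch.sqrt(y).backward()
print(y.grad)  # tensor(inf)

# Not-defined case: sqrt(-1) is NaN, and the gradient value is arbitrary
# (NaN here).
z = torch.tensor(-1.0, requires_grad=True)
torch.sqrt(z).backward()
print(z.grad)  # tensor(nan)
```

The abs case in particular illustrates the minimum-norm choice: of all the valid sub-gradients at 0, autograd returns 0.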
nit: it is
Also why remove the reference?
I thought referencing an exercise in a random book was not too cool.
If you think it's necessary, I can look up a proper reference, but we never give references for anything else, so that one looked a bit out of context.
@albanD this is ready for another review
SGTM
@pytorchbot merge
@pytorchbot successfully started a merge job. Check the current status here
Summary: I also took this chance to clean a bit the sphinx formatting and reworded a few minor things.
Pull Request resolved: #78617
Approved by: https://github.com/soulitzer, https://github.com/albanD
Test Plan: contbuild & OSS CI, see https://hud.pytorch.org/commit/pytorch/pytorch/a8ea58afeeb4aed902e52478384003505fbbb107
Reviewed By: osalpekar
Differential Revision: D37025684
Pulled By: osalpekar
fbshipit-source-id: dcbb76366365e05a9c83d70820a531d4c6cc4b3a