GradClipByValue callback #315
Conversation
Awesome! This is nice to add and the implementation is quite simple. 👍
doc/callbacks.md (outdated):

> Given the gradient and a maximum norm value, the callback normalizes the gradient so that its L2-norm is less than or equal to the given maximum norm value.
It might be useful to add a sentence or two about where and when this is useful. (Same for GradClipByValue.) 👍
Co-authored-by: Ryan Curtin <ryan@ratml.org>
…hen to use the callback.
Second approval provided automatically after 24 hours. 👍
I was writing a recurrent network test case where the gradient explodes, so I implemented gradient clipping.