How to apply gradient clipping in TensorFlow 2.0? #28707
Comments
@tanzhenyu Is this already supported in keras optimizers v2?
A simple method to apply gradient clipping in TensorFlow 2.0:
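(The snippet for this comment did not survive extraction; the following is a minimal sketch of the simple method, using the clipnorm/clipvalue constructor arguments that the v2 Keras optimizers support. Clipping then happens automatically inside apply_gradients.)

```python
import tensorflow as tf

# Clip each variable's gradient to a maximum L2 norm of 1.0 ...
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01, clipnorm=1.0)

# ... or clip each gradient element into [-0.5, 0.5].
optimizer = tf.keras.optimizers.Adam(learning_rate=0.01, clipvalue=0.5)

# Use the optimizer as usual, e.g.:
# model.compile(optimizer=optimizer, loss="mse"); model.fit(...)
```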
Hi, you can clip the gradients manually, just as we used to do in TF 1.x.
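For reference, a sketch of the TF 1.x pattern this comment alludes to (runnable in TF 2.x graph mode via the compat layer; the variable and loss here are placeholders): compute the gradients, clip them explicitly, then apply them.

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.Variable([3.0, 4.0])
loss = tf.reduce_sum(x ** 2)

optimizer = tf.compat.v1.train.AdamOptimizer(0.01)
grads_and_vars = optimizer.compute_gradients(loss)
# Clip each gradient to a maximum L2 norm of 1.0 before applying it.
clipped = [(tf.clip_by_norm(g, 1.0), v) for g, v in grads_and_vars]
train_op = optimizer.apply_gradients(clipped)

with tf.compat.v1.Session() as sess:
    sess.run(tf.compat.v1.global_variables_initializer())
    sess.run(train_op)
```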
There are essentially 2 ways to do this, as mentioned above, so I will just summarize here: (1) pass clipnorm or clipvalue when constructing the optimizer, or (2) compute the gradients yourself, clip them, and call apply_gradients. They are both correct; it's just that the 2nd option gives you more flexibility (for example, clipping by the global norm across all variables), as sketched below.
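The 2nd option in TF 2.0 eager style (a sketch; the model, loss, and data names are placeholders): clip between tape.gradient() and apply_gradients(), where any clipping scheme can be used.

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
optimizer = tf.keras.optimizers.Adam(0.01)
loss_fn = tf.keras.losses.MeanSquaredError()

@tf.function
def train_step(inputs, targets):
    with tf.GradientTape() as tape:
        loss = loss_fn(targets, model(inputs))
    grads = tape.gradient(loss, model.trainable_variables)
    # The flexibility: clip by the global norm across all variables here,
    # or swap in tf.clip_by_norm / tf.clip_by_value per gradient.
    grads, _ = tf.clip_by_global_norm(grads, 1.0)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```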
If we run the following under TF 1.x (graph mode):

```python
import tensorflow as tf
from tensorflow import keras

x = tf.Variable([3.0, 4.0])
y = tf.Variable([1.0, 1.0, 1.0, 1.0])
z = tf.reduce_sum(x ** 2) + tf.reduce_sum(y)

adam = keras.optimizers.Adam(0.01, clipnorm=1.0)
grads = adam.get_gradients(z, [x, y])

sess = tf.Session()
sess.run(tf.global_variables_initializer())
print(sess.run(grads))
# outputs: [0.6, 0.8], [0.5, 0.5, 0.5, 0.5]
```

The raw gradients are [6.0, 8.0] (norm 10) and [1.0, 1.0, 1.0, 1.0] (norm 2), and each is rescaled to norm 1.0 independently. That means the optimizer clips gradients using the clipnorm parameter as a local norm for each Variable.
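For contrast, a quick eager-mode check (a hypothetical follow-up, not from the thread) of what global-norm clipping does to those same raw gradients:

```python
import tensorflow as tf

raw_grads = [tf.constant([6.0, 8.0]), tf.constant([1.0, 1.0, 1.0, 1.0])]
clipped, global_norm = tf.clip_by_global_norm(raw_grads, clip_norm=1.0)
print(global_norm.numpy())  # sqrt(6^2 + 8^2 + 4 * 1^2) = sqrt(104) ≈ 10.2
# Unlike clipnorm, every gradient is scaled by the same factor 1.0 / 10.2:
print([g.numpy() for g in clipped])
```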
If you're using tf-nightly (TF 2.x), then prefix the TF 1.x calls above with tf.compat.v1 (tf.compat.v1.Session, tf.compat.v1.global_variables_initializer, and so on).
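Concretely, a sketch of that port applied to the snippet above (assuming graph-mode execution is restored first):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.Variable([3.0, 4.0])
y = tf.Variable([1.0, 1.0, 1.0, 1.0])
z = tf.reduce_sum(x ** 2) + tf.reduce_sum(y)
adam = tf.keras.optimizers.Adam(0.01, clipnorm=1.0)
grads = adam.get_gradients(z, [x, y])

sess = tf.compat.v1.Session()
sess.run(tf.compat.v1.global_variables_initializer())
print(sess.run(grads))
```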
I want to apply gradient clipping in TF 2.0. In TF 1.x, the best solution was to wrap the optimizer with
tf.contrib.estimator.clip_gradients_by_norm
However, I can't find this function in TF 2.0 after trying many approaches; as far as I know, tf.contrib has been removed in TF 2.0.
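In the meantime, one way to reproduce what the contrib decorator did (a hypothetical helper, not a TensorFlow API; clip_gradients_by_norm clipped by global norm before applying):

```python
import tensorflow as tf

def apply_gradients_clipped(optimizer, grads_and_vars, clip_norm):
    """Hypothetical stand-in for tf.contrib.estimator.clip_gradients_by_norm:
    clip all gradients by their global norm, then apply them."""
    grads, variables = zip(*grads_and_vars)
    clipped, _ = tf.clip_by_global_norm(list(grads), clip_norm)
    optimizer.apply_gradients(zip(clipped, variables))
```

Call it wherever you would otherwise call optimizer.apply_gradients, e.g. apply_gradients_clipped(optimizer, zip(grads, model.trainable_variables), 1.0).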