I changed my mind after accidentally writing log_sigmoid(x) = -tf.nn.softplus(x) instead of the correct -tf.nn.softplus(-x). We should have tf.log_sigmoid, though it should be implemented in pure Python.
This is a numerically stable version of tf.log(tf.sigmoid(x)). It's just
-tf.nn.softplus(-x), but it's easy to add and the identity is easy to mistype.
RELNOTES: Add tf.log_sigmoid(x) = tf.log(tf.sigmoid(x)) = -tf.nn.softplus(-x).
Fixes tensorflow#3719.
Change: 154308666
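For reference, here is a minimal NumPy sketch of the identity (not the TensorFlow implementation): it computes log(sigmoid(x)) as -softplus(-x), with softplus(z) itself evaluated via the stable form max(z, 0) + log1p(exp(-|z|)) so that large negative inputs don't underflow to log(0) = -inf.

```python
import numpy as np

def log_sigmoid(x):
    # log(sigmoid(x)) = -softplus(-x), where
    # softplus(z) = log(1 + exp(z)) is computed stably as
    # max(z, 0) + log1p(exp(-|z|)) to avoid overflow of exp(z).
    z = -np.asarray(x, dtype=np.float64)
    return -(np.maximum(z, 0.0) + np.log1p(np.exp(-np.abs(z))))

# The naive form fails for large negative x: sigmoid(-1000)
# underflows to 0, so log(sigmoid(-1000)) returns -inf, while
# the stable version correctly returns approximately -1000.
```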
It would be nice to have a numerically stable log_sigmoid similar to the existing log_softmax.