
Commit 5ce819b
Remove dependency on Tensorflow by fixing clipping bounds suggested in
titu1994 committed Mar 11, 2019
1 parent a0344d1 commit 5ce819b
Showing 2 changed files with 2 additions and 9 deletions.
4 changes: 0 additions & 4 deletions README.md
@@ -48,10 +48,6 @@ Weights are available inside the [Releases tab](https://github.com/titu1994/kera

 <img src="https://github.com/titu1994/keras-adabound/blob/master/images/val_loss.PNG?raw=true" height=50% width=100%>
 
-# Issue with clipping
-
-Currently dependent on Tensorflow backend for `tf.clip_by_value`. Will be backend independent after next release of Keras.
-
 # Requirements
 - Keras 2.2.4+ & Tensorflow 1.12+ (Only supports TF backend for now).
 - Numpy
7 changes: 2 additions & 5 deletions adabound.py
@@ -1,4 +1,3 @@
-import tensorflow as tf
 from keras import backend as K
 from keras.optimizers import Optimizer

@@ -100,10 +99,8 @@ def get_updates(self, loss, params):
             # Compute the bounds
             step_size_p = step_size * K.ones_like(denom)
             step_size_p_bound = step_size_p / denom
-            # TODO: Replace with K.clip after releast of Keras > 2.2.4
-            bounded_lr_t = m_t * tf.clip_by_value(step_size_p_bound,
-                                                  lower_bound,
-                                                  upper_bound)
+            bounded_lr_t = m_t * K.minimum(K.maximum(step_size_p_bound,
+                                                     lower_bound), upper_bound)
 
             p_t = p - bounded_lr_t

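The replacement works because elementwise clipping is just a composition of two ops, clip(x, lower, upper) = min(max(x, lower), upper), and both `K.maximum` and `K.minimum` are part of the stock Keras backend, so the direct TensorFlow import can go. A minimal sketch (not part of the commit; assumes a Keras 2.x environment with a working backend) checking that the new expression matches ordinary clipping:

```python
import numpy as np
from keras import backend as K

# min(max(x, lower), upper) pins every element of x into [lower, upper],
# which is exactly what tf.clip_by_value did in the removed code.
x = K.constant([0.05, 0.5, 3.0])
lower, upper = K.constant(0.1), K.constant(1.0)

clipped = K.minimum(K.maximum(x, lower), upper)
print(K.eval(clipped))                      # -> [0.1 0.5 1. ]
print(np.clip([0.05, 0.5, 3.0], 0.1, 1.0))  # same result via NumPy
```

The removed TODO hints at why `K.clip` itself wasn't usable: in Keras 2.2.4 it effectively assumed scalar bounds, while AdaBound's `lower_bound` and `upper_bound` are tensors, so the nested `K.maximum`/`K.minimum` form is the backend-agnostic workaround.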
