Keras port of the PyTorch AdaBound optimizer, from the paper *Adaptive Gradient Methods with Dynamic Bound of Learning Rate*.
Add the `adabound.py` script to your project and import it. AdaBound can be used as a drop-in replacement for the Adam optimizer. The AMSBound variant is also supported; it relates to AdaBound in the same way AMSGrad relates to Adam.
```python
from adabound import AdaBound

optm = AdaBound(lr=1e-03,
                final_lr=0.1,
                gamma=1e-03,
                weight_decay=0.,
                amsbound=False)
```
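As with any Keras optimizer, the instance can then be passed to `model.compile`. A minimal sketch; the small two-layer model here is purely illustrative and not part of this repository:

```python
from keras.models import Sequential
from keras.layers import Dense
from adabound import AdaBound

# Illustrative model; any Keras model works the same way.
model = Sequential([
    Dense(64, activation='relu', input_shape=(100,)),
    Dense(10, activation='softmax'),
])

# Drop-in replacement for Adam: simply swap in the AdaBound instance.
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1),
              loss='categorical_crossentropy',
              metrics=['accuracy'])
```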
With a Wide ResNet 34 and horizontal-flip data augmentation, trained for 100 epochs at batch size 128, it reaches 92.16% (called v1).
Weights are available in the Releases tab.
- The smaller ResNet 20 models have been removed, as they did not perform as expected and relied on a flaw in the initial implementation. The ResNet 32 shows the actual performance of this optimizer.
With a small ResNet 20, width/height shift plus horizontal-flip data augmentation, and 100 epochs of training at batch size 1024, it reaches 89.5% (called v1).
On a small ResNet 20 with only width and height shift augmentation, trained for 100 epochs at batch size 1024, the model gets close to 86% on the test set (called v3 below).
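A rough sketch of this kind of training run, assuming CIFAR-10 as the dataset and a hypothetical `build_resnet()` helper for the model; the augmentation parameters are assumptions and are not taken from this repository:

```python
from keras.datasets import cifar10
from keras.preprocessing.image import ImageDataGenerator
from keras.utils import to_categorical
from adabound import AdaBound

(x_train, y_train), (x_test, y_test) = cifar10.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
y_train, y_test = to_categorical(y_train, 10), to_categorical(y_test, 10)

# Width/height shifts plus horizontal flips, as in the runs described above.
datagen = ImageDataGenerator(width_shift_range=0.1,
                             height_shift_range=0.1,
                             horizontal_flip=True)

model = build_resnet()  # hypothetical helper returning an uncompiled Keras model
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1),
              loss='categorical_crossentropy',
              metrics=['accuracy'])

model.fit_generator(datagen.flow(x_train, y_train, batch_size=128),
                    steps_per_epoch=len(x_train) // 128,
                    epochs=100,
                    validation_data=(x_test, y_test))
```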
Currently dependent on the Tensorflow backend for `tf.clip_by_value`; this will become backend independent after the next Keras release.
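The backend dependency comes from clipping the per-parameter step size into the dynamic bound from the paper. A minimal sketch of that clipping, following the schedule used by the reference PyTorch implementation (not copied from `adabound.py`):

```python
import tensorflow as tf

def bounded_step_size(base_step, final_lr, gamma, t):
    """Clip an Adam-style step size into the AdaBound interval.

    base_step: lr / (sqrt(v_hat) + epsilon), the unbounded Adam step.
    t: current iteration count (float tensor, starting at 1).
    """
    lower = final_lr * (1.0 - 1.0 / (gamma * t + 1.0))
    upper = final_lr * (1.0 + 1.0 / (gamma * t))
    # tf.clip_by_value is the TF-specific op mentioned above; both bounds
    # converge to final_lr as t grows, so the step approaches SGD behaviour.
    return tf.clip_by_value(base_step, lower, upper)
```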
- Keras 2.2.4+ & Tensorflow 1.12+ (only the TF backend is supported for now).
- Numpy