
# Keras AdaBound


AdaBound optimizer in Keras.

## Install

```bash
pip install keras-adabound
```

## Usage

### Use the optimizer

```python
from keras_adabound import AdaBound

model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1), loss=model_loss)
```
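For context, here is a minimal end-to-end sketch on random data. The architecture, the `mse` loss, and the data are purely illustrative; the `lr`/`final_lr` values simply mirror the snippet above:

```python
import numpy as np
import keras
from keras_adabound import AdaBound

# Toy regression model; the architecture is arbitrary and only for illustration.
model = keras.models.Sequential([
    keras.layers.Dense(32, activation='relu', input_shape=(16,)),
    keras.layers.Dense(1),
])

# Compile with AdaBound as in the snippet above.
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1), loss='mse')

# Fit on random data just to show the optimizer in action.
x = np.random.rand(256, 16)
y = np.random.rand(256, 1)
model.fit(x, y, epochs=2, batch_size=32)
```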

### Load with custom objects

```python
import keras
from keras_adabound import AdaBound

model = keras.models.load_model(model_path, custom_objects={'AdaBound': AdaBound})
```
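As a quick save/load round trip, a self-contained sketch; the tiny model and the path `adabound_model.h5` are illustrative assumptions:

```python
import numpy as np
import keras
from keras_adabound import AdaBound

# Build and train a tiny model with AdaBound (architecture is illustrative).
model = keras.models.Sequential([keras.layers.Dense(1, input_shape=(4,))])
model.compile(optimizer=AdaBound(lr=1e-3, final_lr=0.1), loss='mse')
model.fit(np.random.rand(64, 4), np.random.rand(64, 1), epochs=1, verbose=0)

# Save, then reload; custom_objects tells Keras how to rebuild the optimizer.
model_path = 'adabound_model.h5'  # illustrative path
model.save(model_path)
restored = keras.models.load_model(model_path, custom_objects={'AdaBound': AdaBound})
```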

## About weight decay

Unlike the official repo, the optimizer does not take a `weight_decay` argument; an equivalent effect can be obtained by adding L2 regularizers to the weights:

```python
import keras

# An L2 factor of WEIGHT_DECAY / 2 contributes a gradient of WEIGHT_DECAY * w,
# which mimics a weight decay of WEIGHT_DECAY.
regularizer = keras.regularizers.l2(WEIGHT_DECAY / 2)

# Attach the regularizer to every trainable layer that supports it.
for layer in model.layers:
    for attr in ['kernel_regularizer', 'bias_regularizer']:
        if hasattr(layer, attr) and layer.trainable:
            setattr(layer, attr, regularizer)
```
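Note that in many Keras versions regularization losses are collected when a layer is built, so attaching regularizers to an already-built model may not take effect on its own. A common workaround (an assumption, not part of this package) is to round-trip the architecture through JSON and restore the weights, continuing the snippet above:

```python
import keras

# Rebuild the model from its config so the newly attached regularizers are
# registered, then copy the trained weights back (assumed workaround).
weights = model.get_weights()
model = keras.models.model_from_json(model.to_json())
model.set_weights(weights)
```

After rebuilding, compile the model again with `AdaBound` before training.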