Keras RAdam

[Chinese|English]

Unofficial implementation of RAdam (Rectified Adam, arXiv:1908.03265) in Keras and TensorFlow.

Install

pip install keras-rectified-adam

Usage

import keras
import numpy as np
from keras_radam import RAdam

# Build toy model with RAdam optimizer
model = keras.models.Sequential()
model.add(keras.layers.Dense(input_shape=(17,), units=3))
model.compile(RAdam(), loss='mse')

# Generate toy data
x = np.random.standard_normal((4096 * 30, 17))
w = np.random.standard_normal((17, 3))
y = np.dot(x, w)

# Fit
model.fit(x, y, epochs=5)

TensorFlow without Keras

from keras_radam.training import RAdamOptimizer

RAdamOptimizer(learning_rate=1e-3)
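
The optimizer follows the standard tf.train.Optimizer interface, so it can drive a graph-mode training loop directly. A minimal sketch, assuming TensorFlow 1.x; the toy placeholders and random data below are purely illustrative:

import numpy as np
import tensorflow as tf
from keras_radam.training import RAdamOptimizer

# Toy linear regression in graph mode (TensorFlow 1.x style)
x = tf.placeholder(tf.float32, shape=(None, 17))
y = tf.placeholder(tf.float32, shape=(None, 3))
w = tf.Variable(tf.zeros((17, 3)))
loss = tf.reduce_mean(tf.square(tf.matmul(x, w) - y))

# minimize() comes from the standard Optimizer interface
train_op = RAdamOptimizer(learning_rate=1e-3).minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(train_op, feed_dict={x: np.random.rand(32, 17),
                                  y: np.random.rand(32, 3)})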

Use Warmup

from keras_radam import RAdam

RAdam(total_steps=10000, warmup_proportion=0.1, min_lr=1e-5)
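
With these settings, the learning rate rises during the first total_steps * warmup_proportion = 1,000 steps and then decays towards min_lr over the remaining 9,000 steps.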

Q & A

About Correctness

After 500 training steps, this implementation produces losses and weights similar to those of the official implementation.

Use tf.keras or tf-2.0

Set TF_KERAS=1 in the environment variables to use tensorflow.python.keras.
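
The flag is read when the package is imported, so it can also be set from Python before the import. A minimal sketch:

import os
os.environ['TF_KERAS'] = '1'  # must be set before importing keras_radam

from keras_radam import RAdam  # now backed by tensorflow.python.keras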

Use Theano Backend

Set KERAS_BACKEND=theano in the environment variables to enable the Theano backend.
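
Keras reads KERAS_BACKEND when it is first imported, so the variable can likewise be set from Python. A minimal sketch:

import os
os.environ['KERAS_BACKEND'] = 'theano'  # must be set before the first import of keras

import keras
from keras_radam import RAdam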
