
Keras RAdam


[中文|English]

Unofficial implementation of RAdam in Keras and TensorFlow.

Install

pip install keras-rectified-adam

Usage

import keras
import numpy as np
from keras_radam import RAdam

# Build toy model with RAdam optimizer
model = keras.models.Sequential()
model.add(keras.layers.Dense(input_shape=(17,), units=3))
model.compile(RAdam(), loss='mse')

# Generate toy data
x = np.random.standard_normal((4096 * 30, 17))
w = np.random.standard_normal((17, 3))
y = np.dot(x, w)

# Fit
model.fit(x, y, epochs=5)

TensorFlow without Keras

from keras_radam.training import RAdamOptimizer

RAdamOptimizer(learning_rate=1e-3)

Use Warmup

from keras_radam import RAdam

RAdam(total_steps=10000, warmup_proportion=0.1, min_lr=1e-5)
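These arguments define, in effect, a linear ramp-up over the first `warmup_proportion` of `total_steps`, followed by a linear decay toward `min_lr`. A rough sketch of the resulting schedule (illustrative only; `warmup_lr` is a hypothetical helper, not part of the package, and assumes a base learning rate of 1e-3):

```python
def warmup_lr(step, total_steps=10000, warmup_proportion=0.1,
              lr=1e-3, min_lr=1e-5):
    """Sketch of a linear warmup + linear decay learning-rate schedule."""
    warmup_steps = int(total_steps * warmup_proportion)
    if step < warmup_steps:
        # Ramp up linearly from 0 to the base learning rate.
        return lr * step / warmup_steps
    # Decay linearly from lr down to min_lr over the remaining steps.
    decay = (step - warmup_steps) / (total_steps - warmup_steps)
    return lr - (lr - min_lr) * decay
```

For example, with the defaults above the rate peaks at `1e-3` at step 1000 and reaches `1e-5` at step 10000.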

Q & A

About Correctness

After 500 training steps, this implementation produces losses and weights similar to those of the official optimizer.

Use tf.keras or tf-2.0

Set the environment variable TF_KERAS=1 to use tensorflow.python.keras.
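The flag has to be in the environment before `keras_radam` is imported, so one way to set it is from Python itself (a minimal sketch; the import is shown commented out so the snippet stands alone):

```python
import os

# TF_KERAS must be set before keras_radam is imported, so that the
# tensorflow.python.keras implementation is selected instead of keras.
os.environ['TF_KERAS'] = '1'

# from keras_radam import RAdam  # import only after setting the flag
```

Alternatively, export the variable in the shell (`TF_KERAS=1 python train.py`) before launching the process.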

Use theano Backend

Set the environment variable KERAS_BACKEND=theano to enable the Theano backend.
