From 1b5631906c971f5a3cd29c5db8299f1248ed69ea Mon Sep 17 00:00:00 2001
From: Wojciech Rzadkowski
Date: Tue, 11 Aug 2020 18:09:47 +0200
Subject: [PATCH] Add optimizers to readme

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 75adf9002..ddcd7af91 100644
--- a/README.md
+++ b/README.md
@@ -52,7 +52,7 @@ comes with everything you need to start your research, including:
 * **Common layers** (`flax.nn`): Dense, Conv, {Batch|Layer|Group} Norm,
   Attention, Pooling, {LSTM|GRU} Cell, Dropout
-* **Optimizers** (`flax.optim`): SGD, Momentum, Adam, LARS
+* **Optimizers** (`flax.optim`): SGD, Momentum, Adam, LARS, Adagrad, LAMB, RMSprop
 * **Utilities and patterns**: replicated training, serialization and
   checkpointing, metrics, prefetching on device
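Of the optimizers this patch adds to the README, Adagrad has the simplest update rule: each parameter keeps a running sum of its squared gradients, and the step is scaled by the inverse square root of that sum. The sketch below is a minimal pure-Python illustration of that rule, not Flax's `flax.optim.Adagrad` implementation; the function name and default hyperparameters are illustrative.

```python
import math

def adagrad_update(params, grads, accum, lr=0.1, eps=1e-8):
    """One Adagrad step: scale each gradient by the inverse square root
    of its accumulated squared gradients (a per-parameter learning rate)."""
    new_params, new_accum = [], []
    for p, g, a in zip(params, grads, accum):
        a = a + g * g                           # accumulate squared gradient
        p = p - lr * g / (math.sqrt(a) + eps)   # adaptively scaled step
        new_params.append(p)
        new_accum.append(a)
    return new_params, new_accum

# Toy usage: minimize f(x) = x^2 starting from x = 1.0
params, accum = [1.0], [0.0]
for _ in range(100):
    grads = [2.0 * p for p in params]           # gradient of x^2 is 2x
    params, accum = adagrad_update(params, grads, accum)
```

Because the accumulator only grows, the effective learning rate shrinks over time, which is why Adagrad needs no manual decay schedule for this kind of convex toy problem.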