AdamP: Slowing Down the Slowdown for Momentum Optimizers on Scale-invariant Weights (ICLR 2021)
Updated Jan 13, 2021 - Python
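The AdamP idea, per the title, is to keep momentum updates from needlessly inflating the norm of scale-invariant weights (e.g. weights followed by normalization layers). A minimal NumPy sketch of that projection idea is below; the function name `project_radial` and the plain momentum loop are illustrative assumptions, not the paper's full algorithm.

```python
import numpy as np

def project_radial(w, update, eps=1e-8):
    # Remove the radial (norm-growing) component of the update,
    # keeping only the part tangent to the current weight vector.
    # Illustrative sketch; AdamP itself applies this conditionally
    # and combines it with Adam-style moments.
    w_flat, u_flat = w.ravel(), update.ravel()
    coef = np.dot(w_flat, u_flat) / (np.dot(w_flat, w_flat) + eps)
    return update - coef * w

# Plain momentum step with the projection applied (hypothetical setup).
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))
grad = rng.normal(size=(4, 3))
momentum = np.zeros_like(w)
lr, beta = 0.1, 0.9

momentum = beta * momentum + grad
step = project_radial(w, momentum)
# The projected step is (numerically) orthogonal to w,
# so it barely changes the weight norm.
radial = float(np.dot(w.ravel(), step.ravel()))
w = w - lr * step
```

Because `step` is orthogonal to the old `w`, the weight norm grows only at second order in the learning rate, which is the "slowing down the slowdown" effect the title refers to.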
A NumPy-based neural network package implementation
In the following repository you'll find examples of optimizers used in machine learning methods.
Unofficial implementation of the Adan optimizer with Schedule-Free