Lookahead Optimizer: k steps forward, 1 step back for MXNet
Updated Mar 17, 2020 - Python
some tricks for sentence representation
Keras Implementation of the Lookahead Optimizer
A collection of deep learning models (PyTorch implementation)
Quasi Hyperbolic Rectified DEMON Adam/Amsgrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging and decorrelated Weight Decay
optimizer & lr scheduler & loss function collections in PyTorch
🛠 Toolbox to extend PyTorch functionalities
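The repositories above all implement the same algorithm: Lookahead keeps a copy of "slow" weights, lets an inner optimizer take k fast steps, then pulls the slow weights toward the result (phi ← phi + α(θ − phi)). None of the listed repos' APIs are reproduced here; this is a minimal NumPy sketch of that "k steps forward, 1 step back" loop, with plain SGD standing in for the inner optimizer:

```python
import numpy as np

def sgd_step(params, grads, lr=0.1):
    """One fast-optimizer step (plain SGD, for illustration only)."""
    return params - lr * grads

def lookahead(params, grad_fn, k=5, alpha=0.5, outer_steps=20):
    """Lookahead wrapper: take k fast steps from the slow weights,
    then interpolate the slow weights toward the fast result."""
    slow = params.copy()
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):                      # k steps forward
            fast = sgd_step(fast, grad_fn(fast))
        slow = slow + alpha * (fast - slow)     # 1 step back
    return slow

# Toy problem: minimize f(x) = ||x||^2, whose gradient is 2x.
x0 = np.array([3.0, -2.0])
result = lookahead(x0, grad_fn=lambda x: 2 * x)
print(np.allclose(result, 0.0, atol=1e-3))  # converges near the minimum
```

The interpolation step is what distinguishes Lookahead from simply running the inner optimizer longer: the slow weights average out the oscillation of the fast trajectory.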