optimizer & lr scheduler & loss function collections in PyTorch
🛠 Toolbox to extend PyTorch functionalities
some tricks for sentence representation
A collection of deep learning models (PyTorch implementation)
Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad with AdaMod, Gradient Centralization, Lookahead, iterative averaging, and decorrelated weight decay
Lookahead Optimizer: k steps forward, 1 step back, for MXNet (a minimal sketch of the update follows this list)
Keras Implementation of the Lookahead Optimizer
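Several of the repositories above implement the Lookahead optimizer (Zhang et al., 2019), whose core update is exactly the "k steps forward, 1 step back" rule: an inner optimizer takes k fast steps, then a set of slow weights moves a fraction alpha toward the fast weights and the fast weights are reset. Below is a minimal PyTorch sketch of that rule; the `Lookahead` class name and the `k`/`alpha` defaults are illustrative and not taken from any repository listed here.

```python
import torch


class Lookahead:
    """Minimal sketch of the Lookahead rule (Zhang et al., 2019):
    the inner optimizer takes k "fast" steps, then the "slow" weights
    move a fraction alpha toward the fast weights, and the fast
    weights are reset to the slow ones."""

    def __init__(self, inner_optimizer, k=5, alpha=0.5):
        self.inner = inner_optimizer
        self.k = k
        self.alpha = alpha
        self.step_count = 0
        # Snapshot of the slow weights, one tensor per parameter.
        self.slow_weights = [
            [p.detach().clone() for p in group["params"]]
            for group in self.inner.param_groups
        ]

    def zero_grad(self):
        self.inner.zero_grad()

    def step(self):
        self.inner.step()  # one fast step with the inner optimizer
        self.step_count += 1
        if self.step_count % self.k == 0:
            for group, slows in zip(self.inner.param_groups, self.slow_weights):
                for p, slow in zip(group["params"], slows):
                    # slow <- slow + alpha * (fast - slow)
                    slow.add_(p.detach() - slow, alpha=self.alpha)
                    # fast <- slow ("1 step back")
                    p.data.copy_(slow)
```

Wrapping any torch.optim optimizer would then look like `Lookahead(torch.optim.SGD(model.parameters(), lr=0.1))`, with `step()` and `zero_grad()` called as usual in the training loop.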