On the Variance of the Adaptive Learning Rate and Beyond
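This first entry refers to the RAdam paper. Below is a minimal NumPy sketch of its variance-rectified update as described in the published algorithm; the hyperparameter defaults and single-parameter structure are illustrative assumptions, not the repository's actual implementation.

```python
import numpy as np

def radam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One RAdam-style update: rectify the adaptive step once the variance of
    the adaptive learning rate is tractable, otherwise take a momentum-only step."""
    m = beta1 * m + (1 - beta1) * grad           # first moment estimate
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment estimate
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected first moment
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2 ** t / (1.0 - beta2 ** t)
    if rho_t > 4.0:                              # adaptive-lr variance is tractable
        v_hat = np.sqrt(v / (1 - beta2 ** t))
        r_t = np.sqrt(((rho_t - 4) * (rho_t - 2) * rho_inf) /
                      ((rho_inf - 4) * (rho_inf - 2) * rho_t))
        param = param - lr * r_t * m_hat / (v_hat + eps)
    else:                                        # early steps: skip the adaptive term
        param = param - lr * m_hat
    return param, m, v
```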
Educational deep learning library in plain Numpy.
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
Toy implementations of some popular ML optimizers using Python/JAX
A collection of various gradient descent algorithms implemented in Python from scratch
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
The project aimed to implement a deep NN / RNN based solution to develop flexible methods that can adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow
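As a rough illustration of the "k steps forward, 1 step back" idea, here is a minimal sketch of the Lookahead update written in plain NumPy rather than TensorFlow; the inner SGD optimizer, step count `k`, and interpolation factor `alpha` are assumed defaults, not taken from this repository.

```python
import numpy as np

def lookahead_sgd(params, grad_fn, lr=0.1, k=5, alpha=0.5, outer_steps=10):
    """Minimal Lookahead sketch: run k fast (inner SGD) steps, then pull the
    slow weights a fraction alpha toward the fast weights."""
    slow = params.copy()
    for _ in range(outer_steps):
        fast = slow.copy()
        for _ in range(k):                # k steps forward with the inner optimizer
            fast -= lr * grad_fn(fast)
        slow += alpha * (fast - slow)     # 1 step back: interpolate slow toward fast
    return slow

# usage: minimize f(x) = ||x||^2, whose gradient is 2x
x0 = np.array([3.0, -4.0])
print(lookahead_sgd(x0, grad_fn=lambda x: 2 * x))
```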
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
Short description for quick search
Implementation of DDPG using NumPy only (without TensorFlow)
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
A project I made to practice my newfound neural network knowledge: I used Python and NumPy to train a network to recognize MNIST images. Adam and mini-batch gradient descent are implemented.
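For reference, a minimal NumPy sketch of the Adam update used in from-scratch projects like this one; the hyperparameter values are the common defaults from the Adam paper, not values taken from this specific project.

```python
import numpy as np

def adam_step(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: biased moment estimates, bias correction, then the step."""
    m = beta1 * m + (1 - beta1) * grad          # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)                # bias-corrected second moment
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# usage: a few steps on f(x) = ||x||^2 with gradient 2x
x = np.array([3.0, -4.0]); m = np.zeros_like(x); v = np.zeros_like(x)
for t in range(1, 101):
    x, m, v = adam_step(x, 2 * x, m, v, t, lr=0.1)
print(x)
```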
An Educational Framework Based on PyTorch for Deep Learning Education and Exploration
Generate novel artistic images using neural style transfer algorithm
This system discloses a deep-learning-based log analysis method for an intrusion detection system, which includes the following steps: preprocess the acquired logs of different types in the target system; parse the preprocessed logs using a clustering-based method; then encode the parsed log events into digital f…
MetaPerceptron: Unleashing the Power of Metaheuristic-optimized Multi-Layer Perceptron - A Python Library
Implementation of a neural network from scratch using only NumPy (conv, FC, and max-pool layers, optimizers, and activation functions)
ND-Adam is a tailored version of Adam for training DNNs.