On the Variance of the Adaptive Learning Rate and Beyond
Educational deep learning library in plain NumPy.
A collection of various gradient descent algorithms implemented in Python from scratch
A compressed adaptive optimizer for training large-scale deep learning models using PyTorch
This project implements deep NN / RNN based solutions to develop flexible methods that adaptively fill in, backfill, and predict time series using a large number of heterogeneous training datasets.
Reproducing the paper "PADAM: Closing The Generalization Gap of Adaptive Gradient Methods In Training Deep Neural Networks" for the ICLR 2019 Reproducibility Challenge
[Python] [arXiv/cs] Paper "An Overview of Gradient Descent Optimization Algorithms" by Sebastian Ruder
A project I made to practice my newfound neural network knowledge: Python and NumPy are used to train a network to recognize MNIST images, with Adam and mini-batch gradient descent implemented from scratch (a minimal Adam update sketch appears after this list).
Implementation of DDPG with NumPy only (without TensorFlow)
Lookahead optimizer ("Lookahead Optimizer: k steps forward, 1 step back") for TensorFlow (the outer synchronization step is sketched after this list)
Generate novel artistic images using neural style transfer algorithm
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
Keras implementation of arXiv 1806.04854
An OOP deep neural network with a syntax similar to Keras, offering many hyper-parameters, optimizers, and activation functions.
A Deep Learning framework for CNNs and LSTMs from scratch, using NumPy.
Implementation from scratch (using numpy arrays) of a framework based on keras interface which allows to build and train Fully Connected Networks and Convolutional Neural Networks (CNNs).
Adam optimizer with learning rate multipliers for TensorFlow 2.0.
This repo contains an implementation of a CNN-based regression network for pupil center estimation on smartphones.
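Several of the repositories above implement Adam from scratch in NumPy. As a reference point, the following is a minimal sketch of the Adam update rule from Kingma & Ba (2014); the function name `adam_update` and its signature are illustrative and not taken from any of the listed projects.

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array (illustrative sketch).

    t is the 1-based timestep; m and v are the running first and second
    moment estimates, initialized to zeros with the same shape as param.
    """
    # Update biased first and second moment estimates of the gradient.
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2
    # Correct the bias introduced by zero initialization.
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Scale the step by the inverse square root of the second moment.
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v
```

The Lookahead entry above refers to the "k steps forward, 1 step back" scheme: an inner optimizer (such as Adam) takes k fast steps, after which the slow weights are interpolated toward the fast weights and the fast weights are reset. A sketch of that outer synchronization step, with illustrative names, might look like this:

```python
def lookahead_sync(slow_params, fast_params, alpha=0.5):
    """Lookahead outer step: run after every k inner-optimizer updates."""
    for i in range(len(slow_params)):
        # Move slow weights a fraction alpha toward the current fast weights.
        slow_params[i] += alpha * (fast_params[i] - slow_params[i])
        # Restart the inner optimizer from the updated slow weights.
        fast_params[i] = slow_params[i].copy()
    return slow_params, fast_params
```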