Bayesian hyperparameter optimization for neural networks
Updated Dec 20, 2016 - Python
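To illustrate the idea behind Bayesian hyperparameter optimization (this is a toy NumPy sketch of a GP surrogate with a lower-confidence-bound acquisition, not the library's actual API; all function names and defaults here are illustrative):

```python
import numpy as np

def rbf(a, b, length_scale=0.2):
    """Squared-exponential kernel between 1-D point sets a and b."""
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

def bayes_opt(f, lo=0.0, hi=1.0, n_init=3, n_iter=12, seed=0):
    """Minimize f on [lo, hi]: fit a zero-mean GP to the evaluations
    so far, then evaluate f where the lower confidence bound
    mu - 2*sigma is smallest (illustrative helper, not a real API)."""
    rng = np.random.default_rng(seed)
    X = list(rng.uniform(lo, hi, size=n_init))   # random initial design
    y = [f(x) for x in X]
    grid = np.linspace(lo, hi, 201)              # candidate points
    for _ in range(n_iter):
        Xa, ya = np.asarray(X), np.asarray(y)
        K = rbf(Xa, Xa) + 1e-6 * np.eye(len(Xa))  # jitter for stability
        ks = rbf(grid, Xa)
        mu = ks @ np.linalg.solve(K, ya)          # GP posterior mean
        cov_term = np.einsum('ij,ji->i', ks, np.linalg.solve(K, ks.T))
        sigma = np.sqrt(np.clip(1.0 - cov_term, 1e-12, None))
        x_next = grid[np.argmin(mu - 2.0 * sigma)]  # explore vs. exploit
        X.append(float(x_next))
        y.append(f(x_next))
    best = int(np.argmin(y))
    return X[best], y[best]

# Toy objective standing in for a validation-loss curve.
x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2)
```

The surrogate lets each new evaluation be placed where the model is either promising (low mean) or uncertain (high variance), which is why this family of methods is sample-efficient for expensive objectives like neural network training runs.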
Python library for neural networks.
A simple implementation of style transfer
Modified version of the YellowFin optimizer for TensorFlow to work with the Keras API [not actively maintained]
PyTorch implementation of Neural Optimizer Search's Optimizer_1
Machine learning and data analysis written in Python
PyTorch implementation of AddSign and PowerSign optimizers presented in 'Neural Optimizer Search with Reinforcement Learning'
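For reference, the PowerSign and AddSign update rules from that paper scale the gradient step by whether the gradient agrees in sign with its moving average. A minimal NumPy sketch (the function names and the toy quadratic are mine, not the repository's API):

```python
import numpy as np

def powersign_step(w, g, m, lr=0.1, alpha=np.e, beta=0.9):
    """One PowerSign update: the step is scaled by
    alpha ** (sign(g) * sign(m)), i.e. amplified when the gradient g
    agrees in sign with its moving average m and damped otherwise."""
    m = beta * m + (1.0 - beta) * g
    w = w - lr * alpha ** (np.sign(g) * np.sign(m)) * g
    return w, m

def addsign_step(w, g, m, lr=0.1, beta=0.9):
    """One AddSign update: the step is scaled by
    1 + sign(g) * sign(m), i.e. 2*g on agreement, 0 on disagreement."""
    m = beta * m + (1.0 - beta) * g
    w = w - lr * (1.0 + np.sign(g) * np.sign(m)) * g
    return w, m

# Toy example: minimize f(w) = w**2, whose gradient is 2*w.
w, m = 1.0, 0.0
for _ in range(20):
    w, m = powersign_step(w, 2.0 * w, m)  # w shrinks toward 0
```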
How optimizer and learning rate choice affects training performance
Comparative study of FTML optimizer done as part of project work for Statistical Methods in Artificial Intelligence, IIITH
PyTorch Impl. of Prediction Optimizer (to stabilize GAN training)
TensorFlow implementation of entropy SGD
Generalization of Adam, AdaMax, AMSGrad algorithms for PyTorch
Noisy gradient descent optimizer in TensorFlow; you can change the optimizer as you want
Implementation of some new techniques from fastai and other papers that work with Keras models
A TensorFlow/Keras version of the AdaBound optimizer.
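AdaBound keeps Adam's adaptive per-parameter step but clips it between dynamic bounds that converge to a fixed SGD-like rate, so training starts like Adam and ends like SGD. A NumPy sketch of one update following the formulas in the AdaBound paper (names and defaults are illustrative, not this repository's API):

```python
import numpy as np

def adabound_step(w, g, state, lr=1e-3, final_lr=0.1, gamma=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBound-style update on parameters w with gradient g."""
    m, v, t = state
    t += 1
    m = beta1 * m + (1.0 - beta1) * g        # first moment (as in Adam)
    v = beta2 * v + (1.0 - beta2) * g * g    # second moment (as in Adam)
    # bias-corrected base step size, as in Adam
    base = lr * np.sqrt(1.0 - beta2 ** t) / (1.0 - beta1 ** t)
    # dynamic bounds; both converge to final_lr as t grows
    lower = final_lr * (1.0 - 1.0 / (gamma * t + 1.0))
    upper = final_lr * (1.0 + 1.0 / (gamma * t))
    eta = np.clip(base / (np.sqrt(v) + eps), lower, upper)
    return w - eta * m, (m, v, t)

# Toy example: minimize f(w) = w**2, gradient 2*w.
w, state = 1.0, (0.0, 0.0, 0)
for _ in range(500):
    w, state = adabound_step(w, 2.0 * w, state)  # w approaches 0
```

Early in training the bounds are loose (lower near 0, upper very large), so `eta` is effectively Adam's adaptive rate; as `t` grows both bounds squeeze toward `final_lr`, recovering plain SGD with momentum.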
The code for the ICLR 2018 paper "Stabilizing Adversarial Nets With Prediction Methods"
An optimizer that trains as fast as Adam and generalizes as well as SGD, in TensorFlow