
Optimzers HD

University project on the optimization of neural networks through hypergradient based algorithms.

Figure: Comparison between algorithms on a fully connected neural network (MNIST dataset)

The original paper is by Atılım Güneş Baydin, Robert Cornish, David Martínez Rubio, Mark Schmidt, and Frank Wood (2018), "Online Learning Rate Adaptation with Hypergradient Descent".
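For context, the core idea of the paper is to adapt the learning rate online using the "hypergradient", i.e. the derivative of the loss with respect to the learning rate itself, which reduces to a dot product of consecutive gradients. The sketch below is a minimal NumPy illustration of the SGD-HD update rule, assuming a 1-D parameter vector; the function name `sgd_hd` and the hyperparameter values are illustrative and not taken from this repository.

```python
import numpy as np

def sgd_hd(grad_fn, theta0, alpha0=1e-3, beta=1e-4, n_steps=100):
    """Plain SGD with hypergradient learning-rate adaptation (SGD-HD),
    following the update rule described in Baydin et al. (2018)."""
    theta = np.asarray(theta0, dtype=float)
    alpha = alpha0
    prev_grad = np.zeros_like(theta)  # gradient from the previous step
    for _ in range(n_steps):
        grad = grad_fn(theta)
        # Hypergradient step: the derivative of the loss w.r.t. the learning
        # rate is estimated by the dot product of consecutive gradients.
        alpha = alpha + beta * np.dot(grad, prev_grad)
        # Standard SGD step with the adapted learning rate.
        theta = theta - alpha * grad
        prev_grad = grad
    return theta, alpha

# Example: minimise f(theta) = ||theta||^2 / 2, whose gradient is theta itself.
theta, alpha = sgd_hd(lambda th: th, theta0=np.ones(10), alpha0=0.01, beta=1e-3)
```

The same hypergradient correction can be applied on top of other base optimizers (e.g. SGD with Nesterov momentum or Adam), which is what the comparison in the figure above refers to.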

Link to the original paper: link

Link to the original authors' GitHub repository: link

External links
