The_evolution_of_gradient_descent

This is the code for "The Evolution of Gradient Descent" by Siraj Raval on YouTube.

Coding Challenge - Due Date, Thursday June 8th at 12 PM PST

This week's coding challenge is to write out the Adam optimization strategy from scratch. In the process you'll learn about all the other gradient descent variants and why Adam works so well. Bonus points if you add a visual element by plotting the results in a Jupyter notebook. Good luck!
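
For reference, here is a minimal NumPy sketch of the Adam update rule to get you started. The function name, default hyperparameters, and the test problem are illustrative, not part of this repo:

```python
import numpy as np

def adam(grad, x0, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8, steps=1000):
    """Minimize a function given its gradient using the Adam update rule."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment: running mean of gradients
    v = np.zeros_like(x)  # second moment: running mean of squared gradients
    for t in range(1, steps + 1):
        g = grad(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = m / (1 - beta1**t)  # bias correction for the zero-initialized moments
        v_hat = v / (1 - beta2**t)
        x -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Example: minimize f(x, y) = x^2 + 10*y^2; should land near the minimum at (0, 0)
print(adam(lambda p: np.array([2 * p[0], 20 * p[1]]), [3.0, -2.0], lr=0.1, steps=500))
```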

Overview

This is the code for this video on YouTube by Siraj Raval. In the video, we go over the different optimizer options that TensorFlow gives us. Under the hood, they are all variants of gradient descent.
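
As a quick map of those options, here is a sketch of the optimizers covered, assuming the TensorFlow 1.x API that was current when the video was made:

```python
import tensorflow as tf  # assumes TensorFlow 1.x

learning_rate = 0.01
optimizers = {
    "sgd": tf.train.GradientDescentOptimizer(learning_rate),
    "momentum": tf.train.MomentumOptimizer(learning_rate, momentum=0.9),
    "nesterov": tf.train.MomentumOptimizer(learning_rate, momentum=0.9,
                                           use_nesterov=True),
    "adagrad": tf.train.AdagradOptimizer(learning_rate),
    "adadelta": tf.train.AdadeltaOptimizer(),  # adapts its own step sizes
    "rmsprop": tf.train.RMSPropOptimizer(learning_rate),
    "adam": tf.train.AdamOptimizer(learning_rate),
}
# They all expose the same interface, so swapping one for another is one line:
# train_op = optimizers["adam"].minimize(loss)
```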

Dependencies

  • matplotlib (pyplot ships as part of matplotlib, not as a separate package)
  • numpy

Install missing dependencies with pip: `pip install matplotlib numpy`

Usage

Run `jupyter notebook` to see the notebook that compares gradient descent to stochastic gradient descent in the browser. There are also two separate Python files, one for Adadelta and one for Nesterov's method; run those straight from the terminal with the `python` command.
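
For orientation before reading those two scripts, here are the update rules they implement, in minimal NumPy form. The function and variable names are illustrative, not taken from the scripts:

```python
import numpy as np

def nesterov_step(x, v, grad, lr=0.01, mu=0.9):
    """Nesterov momentum: evaluate the gradient at the look-ahead point x + mu*v."""
    v = mu * v - lr * grad(x + mu * v)
    return x + v, v

def adadelta_step(x, grad, state, rho=0.95, eps=1e-6):
    """Adadelta: per-parameter step sizes from running averages; no learning rate."""
    eg2, ed2 = state                       # running averages of grad^2 and update^2
    g = grad(x)
    eg2 = rho * eg2 + (1 - rho) * g**2
    dx = -np.sqrt(ed2 + eps) / np.sqrt(eg2 + eps) * g
    ed2 = rho * ed2 + (1 - rho) * dx**2
    return x + dx, (eg2, ed2)
```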

Credits

The credits for this code go to GRYE and dtnewman. I've merely created a wrapper to get people started.
