
# optimizer-visualization

Visualize gradient descent optimization algorithms in TensorFlow.

All methods start at the same location, specified by two variables. Both the x and y variables are updated by each of the following optimizers (a setup sketch follows the list):

- Adadelta (documentation)
- Adagrad (documentation)
- Adam (documentation)
- Ftrl (documentation)
- GD (documentation)
- Momentum (documentation)
- RMSProp (documentation)
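
A minimal sketch of how such a comparison might be set up in TensorFlow 2. The loss surface, learning rates, starting point, and step count below are illustrative assumptions, not the repository's actual values:

```python
import tensorflow as tf

# Hypothetical loss surface; the repository's actual surface may differ.
def loss_fn(x, y):
    return tf.square(x) + 10.0 * tf.square(y)  # simple elongated bowl

# One optimizer per trajectory; learning rates here are illustrative only.
optimizers = {
    "Adadelta": tf.keras.optimizers.Adadelta(learning_rate=1.0),
    "Adagrad":  tf.keras.optimizers.Adagrad(learning_rate=0.1),
    "Adam":     tf.keras.optimizers.Adam(learning_rate=0.05),
    "Ftrl":     tf.keras.optimizers.Ftrl(learning_rate=0.1),
    "GD":       tf.keras.optimizers.SGD(learning_rate=0.05),
    "Momentum": tf.keras.optimizers.SGD(learning_rate=0.05, momentum=0.9),
    "RMSProp":  tf.keras.optimizers.RMSprop(learning_rate=0.05),
}

start = (2.0, 1.0)  # shared starting point (assumed value)
paths = {}

for name, opt in optimizers.items():
    # Fresh variables per optimizer so every method starts at the same point.
    x = tf.Variable(start[0])
    y = tf.Variable(start[1])
    path = [(x.numpy(), y.numpy())]
    for _ in range(200):
        with tf.GradientTape() as tape:
            loss = loss_fn(x, y)
        grads = tape.gradient(loss, [x, y])
        opt.apply_gradients(zip(grads, [x, y]))
        path.append((x.numpy(), y.numpy()))
    paths[name] = path  # plot these trajectories over the loss contours
```

Plotting each recorded path over the contours of the loss surface reproduces the kind of comparison animated in the figures below.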

For an overview of these gradient descent optimization algorithms, visit this helpful resource.

The numbers in the figure legends indicate the learning rate, which is specific to each optimizer.

Note the optimizers' behavior when the gradient is steep.

Note the optimizers' behavior when the initial gradient is minuscule.

Inspired by the following GIFs:

From here