
Optimizers

Stochastic Gradient Descent (SGD)
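A minimal NumPy sketch of the plain SGD update rule, theta <- theta - lr * grad (an illustration only, not this library's actual API; the function name and signature are assumptions):

.. code-block:: python

    import numpy as np

    def sgd_update(param: np.ndarray, grad: np.ndarray, lr: float = 0.01) -> np.ndarray:
        """One plain SGD step: move the parameter against its gradient."""
        # theta <- theta - lr * grad
        return param - lr * grad

    # Example: a few steps on f(x) = x**2, whose gradient is 2 * x.
    x = np.array(5.0)
    for _ in range(100):
        x = sgd_update(x, 2.0 * x, lr=0.1)
    # x is now close to the minimizer at 0.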

RMSprop
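Likewise, a hedged sketch of one RMSprop step (how the running-average state is threaded through is an assumption; the defaults follow common choices such as those in tf.keras):

.. code-block:: python

    import numpy as np

    def rmsprop_update(param, grad, cache, lr=0.001, rho=0.9, eps=1e-7):
        """One RMSprop step: divide the learning rate by a running
        root-mean-square of recent gradients."""
        cache = rho * cache + (1.0 - rho) * grad ** 2       # running average of squared gradients
        param = param - lr * grad / (np.sqrt(cache) + eps)  # eps guards against division by zero
        return param, cache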

Adam (ADAptive Moment estimation)

Warning

This Adam implementation appears to produce worse results than tf.keras.optimizers.Adam. Further debugging is required.
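For comparison while debugging, here is a sketch of one Adam step following Algorithm 1 of [KINGMA2015] (the function name and state threading are assumptions, not this library's API):

.. code-block:: python

    import numpy as np

    def adam_update(param, grad, m, v, t,
                    lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
        """One Adam step per Kingma & Ba (2015), Algorithm 1."""
        t += 1
        m = beta1 * m + (1.0 - beta1) * grad         # biased first moment estimate
        v = beta2 * v + (1.0 - beta2) * grad ** 2    # biased second moment estimate
        m_hat = m / (1.0 - beta1 ** t)               # bias-corrected first moment
        v_hat = v / (1.0 - beta2 ** t)               # bias-corrected second moment
        param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
        return param, m, v, t

Implementations differ in where eps is applied (inside vs. outside the square root) and in whether bias correction is folded into the step size; either difference is a plausible source of divergence from tf.keras.optimizers.Adam.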

KINGMA2015

Diederik P. Kingma and Jimmy Ba, "Adam: A Method for Stochastic Optimization," arXiv:1412.6980 [cs.LG], 2015. Available: https://arxiv.org/abs/1412.6980