From 45dea32f30182f467050836e568204d07ceb5372 Mon Sep 17 00:00:00 2001
From: Yurii Shevchuk
Date: Mon, 10 Dec 2018 12:19:57 +0100
Subject: [PATCH] changed order for the training algorithms

---
 site/pages/cheatsheet.rst | 12 ++++++------
 1 file changed, 6 insertions(+), 6 deletions(-)

diff --git a/site/pages/cheatsheet.rst b/site/pages/cheatsheet.rst
index 7b67f185..c7e51b88 100644
--- a/site/pages/cheatsheet.rst
+++ b/site/pages/cheatsheet.rst
@@ -21,19 +21,19 @@ Training algorithms
     :header: "Class name", "Name"
 
     :network:`GradientDescent`, Gradient Descent
+    :network:`Momentum`, Momentum
+    :network:`Adam`, Adam
+    :network:`Adamax`, AdaMax
+    :network:`RMSProp`, RMSProp
+    :network:`Adadelta`, Adadelta
+    :network:`Adagrad`, Adagrad
     :network:`ConjugateGradient`, Conjugate Gradient
     :network:`QuasiNewton`, quasi-Newton
     :network:`LevenbergMarquardt`, Levenberg-Marquardt
     :network:`Hessian`, Hessian
     :network:`HessianDiagonal`, Hessian diagonal
-    :network:`Momentum`, Momentum
     :network:`RPROP`, RPROP
     :network:`IRPROPPlus`, iRPROP+
-    :network:`Adadelta`, Adadelta
-    :network:`Adagrad`, Adagrad
-    :network:`RMSProp`, RMSProp
-    :network:`Adam`, Adam
-    :network:`Adamax`, AdaMax
 
 Regularization methods
 ++++++++++++++++++++++