Quasi-Hyperbolic Rectified DEMON Adam/AMSGrad optimizer with AdaMod, Gradient Centralization, Lookahead, iterate averaging, and decoupled weight decay (AdamW)
Topics: ranger, adam, demon, lookahead, amsgrad, adamw, radam, adamod, gradient-centralization, qhadam, decay-momentum, iterate-averaging, qhranger, adaptive-optimizer, hypergradient-descent, hd-adam, hd-sgd, nosadam, nostalgic-adam
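Among the topics above, gradient centralization is simple enough to sketch: for weight tensors with more than one dimension, the per-output-slice mean is subtracted from the gradient before the optimizer step. The following is an illustrative NumPy sketch of that idea, not code from this repository (the function name `centralize_gradient` is ours):

```python
import numpy as np

def centralize_gradient(grad):
    # Gradient Centralization: for multi-dimensional weight gradients,
    # subtract the mean taken over all axes except the output axis (axis 0),
    # so each output slice of the gradient has zero mean.
    # 1-D gradients (e.g. biases) are returned unchanged.
    if grad.ndim > 1:
        axes = tuple(range(1, grad.ndim))
        grad = grad - grad.mean(axis=axes, keepdims=True)
    return grad

g = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])
gc = centralize_gradient(g)
# each row of gc now sums to zero
```

In an optimizer loop this transform would be applied to each parameter's gradient just before the usual Adam-style update.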
Updated Sep 23, 2020 - Python
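The decay-momentum (DEMON) topic refers to decaying Adam's momentum coefficient over training. As a hedged sketch of the published schedule (an assumption about intent, not this repository's implementation; the name `demon_beta` is ours), the momentum at step t out of T total steps is:

```python
def demon_beta(beta_init, t, T):
    # DEMON momentum decay: interpolate beta from beta_init at t=0 down to 0 at t=T.
    # beta_t = beta_init * (1 - t/T) / ((1 - beta_init) + beta_init * (1 - t/T))
    frac = 1.0 - t / T
    return beta_init * frac / ((1.0 - beta_init) + beta_init * frac)
```

For example, with `beta_init=0.9` the schedule starts at 0.9, stays above the linear ramp for most of training, and reaches exactly 0 at the final step.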