# adam
Here are 3 public repositories matching this topic...
ADAS (Adaptive Step Size) is an optimizer that, unlike optimizers that merely normalize the gradient, adaptively tunes the step size itself, aiming to make step-size scheduling unnecessary and to achieve state-of-the-art training performance.
Updated Jan 19, 2021 - C++
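The description above concerns adaptive-step-size optimizers in the Adam family. As background for the topic, here is a minimal sketch of the standard Adam update rule (Kingma & Ba); this is illustrative only and is not the ADAS algorithm from the repository, whose step-size adaptation differs.

```python
import numpy as np

def adam_step(theta, grad, m, v, t, lr=1e-3, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update. t is the 1-based step counter."""
    # Exponential moving averages of the gradient and squared gradient
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    # Bias correction for the zero-initialized moment estimates
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    # Per-parameter adaptive step: large where gradients are consistently
    # small, damped where squared gradients are large
    theta = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta, m, v
```

For example, iterating this step on the gradient of f(x) = x² drives x toward the minimum at 0 without any hand-tuned step-size schedule.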
Easy-to-use linear and non-linear solver wrapper

Topics: newton, cpp, eigen, pardiso, suitesparse, solvers, nonlinear-optimization, adam, stochastic-gradient-descent, linear-solvers, linesearch, hypre, iterative-solvers, amgcl, lbfgs-optimizer
Updated Sep 12, 2024 - C++
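Among the topics this solver wrapper lists is Newton's method for non-linear problems. As a quick illustration of the technique (not this library's C++ API), here is a minimal one-dimensional Newton iteration; the function names and tolerances are hypothetical choices for the sketch.

```python
def newton(f, fprime, x0, tol=1e-10, max_iter=50):
    """Find a root of f via Newton's method: x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # residual small enough: converged
            break
        x = x - fx / fprime(x)  # Newton step using the local linearization
    return x
```

For instance, `newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)` converges quadratically to the square root of 2.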