Expose adamax and adamaxw #433

Merged
merged 2 commits into from Oct 5, 2022
14 changes: 14 additions & 0 deletions docs/api.rst
@@ -10,6 +10,8 @@ Common Optimizers
adagrad
adam
adamw
adamax
adamaxw
fromage
lamb
lars
@@ -48,6 +50,16 @@ AdamW

.. autofunction:: adamw

Adamax
~~~~~~

.. autofunction:: adamax

AdamaxW
~~~~~~~

.. autofunction:: adamaxw

Fromage
~~~~~~~

@@ -147,6 +159,7 @@ Gradient Transforms
Params
scale
scale_by_adam
scale_by_adamax
scale_by_belief
scale_by_factored_rms
scale_by_optimistic_gradient
@@ -257,6 +270,7 @@ Optax Transforms and States

.. autofunction:: scale
.. autofunction:: scale_by_adam
.. autofunction:: scale_by_adamax
.. autofunction:: scale_by_belief
.. autofunction:: scale_by_factored_rms
.. autofunction:: scale_by_param_block_norm
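For reference, the `scale_by_adamax` transform documented above can be composed with `scale` to build the bare Adamax update by hand. This is only a sketch; the hyperparameter values are illustrative and not taken from this PR.

```python
# Sketch: composing the documented transforms into an Adamax-style optimizer.
# Hyperparameter values are illustrative, not defaults defined by this PR.
import optax

tx = optax.chain(
    optax.scale_by_adamax(b1=0.9, b2=0.999, eps=1e-8),
    optax.scale(-1e-3),  # negate so the update is a descent step at learning rate 1e-3
)
```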
4 changes: 4 additions & 0 deletions optax/__init__.py
@@ -20,6 +20,8 @@
from optax._src.alias import adagrad
from optax._src.alias import adam
from optax._src.alias import adamw
from optax._src.alias import adamax
from optax._src.alias import adamaxw
from optax._src.alias import dpsgd
from optax._src.alias import fromage
from optax._src.alias import lamb
@@ -176,6 +178,8 @@
"adagrad",
"adam",
"adamw",
"adamax",
"adamaxw",
"adaptive_grad_clip",
"AdaptiveGradClipState",
"add_decayed_weights",
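With these exports in place, the new aliases should be usable like the existing `adam`/`adamw` ones. A minimal end-to-end sketch follows; the model, data, and hyperparameters are made up for illustration.

```python
# Minimal usage sketch of the newly exposed aliases.
# The loss function, shapes, and hyperparameters below are hypothetical.
import jax
import jax.numpy as jnp
import optax

params = {"w": jnp.zeros(3)}

def loss_fn(params, x, y):
    pred = params["w"] @ x
    return (pred - y) ** 2

optimizer = optax.adamax(learning_rate=1e-3)
# or, with decoupled weight decay:
# optimizer = optax.adamaxw(learning_rate=1e-3, weight_decay=1e-4)

opt_state = optimizer.init(params)

x, y = jnp.ones(3), jnp.array(2.0)
grads = jax.grad(loss_fn)(params, x, y)
updates, opt_state = optimizer.update(grads, opt_state, params)
params = optax.apply_updates(params, updates)
```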