Commit e026a15

Merge pull request #433 from nasyxx:patch-1
PiperOrigin-RevId: 478994245
OptaxDev committed Oct 5, 2022
2 parents 232c48d + 71e7455
Showing 2 changed files with 20 additions and 0 deletions.
docs/api.rst (14 additions, 0 deletions)
@@ -10,6 +10,8 @@ Common Optimizers
     adagrad
     adam
     adamw
+    adamax
+    adamaxw
     fromage
     lamb
     lars
@@ -43,6 +45,16 @@ Adam
 
 .. autofunction:: adam
 
+Adamax
+~~~~~~
+
+.. autofunction:: adamax
+
+AdamaxW
+~~~~~~~
+
+.. autofunction:: adamaxw
+
 AdamW
 ~~~~~
@@ -147,6 +159,7 @@ Gradient Transforms
     Params
     scale
     scale_by_adam
+    scale_by_adamax
     scale_by_belief
     scale_by_factored_rms
     scale_by_optimistic_gradient
@@ -257,6 +270,7 @@ Optax Transforms and States
 
 .. autofunction:: scale
 .. autofunction:: scale_by_adam
+.. autofunction:: scale_by_adamax
 .. autofunction:: scale_by_belief
 .. autofunction:: scale_by_factored_rms
 .. autofunction:: scale_by_param_block_norm
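For readers trying out the two newly documented aliases: they follow Optax's standard GradientTransformation pattern (init, update, then optax.apply_updates). Below is a minimal usage sketch, not part of this commit; the toy parameters, loss function, and learning rate are illustrative assumptions.

    # Minimal usage sketch (illustrative, not from this commit).
    # Toy parameters, loss, and learning rate are assumptions.
    import jax
    import jax.numpy as jnp
    import optax

    params = {"w": jnp.ones((3,))}   # hypothetical toy parameters

    def loss_fn(p):
        return jnp.sum(p["w"] ** 2)  # hypothetical quadratic loss

    opt = optax.adamax(learning_rate=1e-3)  # alias added by this commit
    opt_state = opt.init(params)

    grads = jax.grad(loss_fn)(params)                # compute gradients
    updates, opt_state = opt.update(grads, opt_state)
    params = optax.apply_updates(params, updates)    # apply the step

The adamaxw alias is used the same way, but since it applies decoupled weight decay it also needs the current parameters at each step, i.e. opt.update(grads, opt_state, params).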
optax/__init__.py (6 additions, 0 deletions)
@@ -19,6 +19,8 @@
 from optax._src.alias import adafactor
 from optax._src.alias import adagrad
 from optax._src.alias import adam
+from optax._src.alias import adamax
+from optax._src.alias import adamaxw
 from optax._src.alias import adamw
 from optax._src.alias import dpsgd
 from optax._src.alias import fromage
@@ -122,6 +124,7 @@
 from optax._src.transform import EmaState
 from optax._src.transform import scale
 from optax._src.transform import scale_by_adam
+from optax._src.transform import scale_by_adamax
 from optax._src.transform import scale_by_belief
 from optax._src.transform import scale_by_optimistic_gradient
 from optax._src.transform import scale_by_param_block_norm
@@ -175,6 +178,8 @@
     "adafactor",
     "adagrad",
     "adam",
+    "adamax",
+    "adamaxw",
     "adamw",
     "adaptive_grad_clip",
     "AdaptiveGradClipState",
@@ -267,6 +272,7 @@
     "safe_root_mean_squares",
     "ScalarOrSchedule",
     "scale_by_adam",
+    "scale_by_adamax",
     "scale_by_belief",
     "scale_by_factored_rms",
     "scale_by_param_block_norm",
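The re-exported scale_by_adamax can also be composed by hand with optax.chain, mirroring how Optax builds its optimizer aliases from lower-level gradient transformations. A hedged sketch follows; the b1/b2/eps values are assumed Adam-style defaults, and -1e-3 is an assumed step size.

    # Hedged sketch: composing the newly exported transform by hand.
    # b1/b2/eps are assumed defaults; -1e-3 is an assumed step size.
    import optax

    optimizer = optax.chain(
        optax.scale_by_adamax(b1=0.9, b2=0.999, eps=1e-8),
        optax.scale(-1e-3),  # negative sign so updates descend the gradient
    )

The transform is named after Adamax (Kingma & Ba, 2015), which replaces Adam's second-moment accumulator with an infinity-norm estimate.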
