Commit

Update docs
tjruwase committed Aug 22, 2023
1 parent b067751 commit 49e8153
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/code-docs/source/zero3.rst
@@ -387,15 +387,15 @@ Sometimes, a user may want to modify parameters or optimizer states outside of t
 .. autofunction:: deepspeed.utils.safe_set_full_optimizer_state


-These routines can be used at any point after `deepspeed.initialize()` as shown in the following snippet.
+These routines can be used at any point after initialization of the DeepSpeed engine (i.e., ``deepspeed.initialize()``) as shown in the following snippet.

 .. code-block:: python

     [...]
     from deepspeed.utils import safe_set_full_fp32_param, safe_set_full_optimizer_state
     # Here is an example to zero all the fp32 parameters and optimizer states.
     for n, lp in model.named_parameters():
-        # Assume zero stage 1 and 2, since stage 3 requires a gather to assemble lp
+        # Assume zero stage 1 or 2, since stage 3 requires a gather to assemble lp
         zero_tensor = torch.zeros_like(lp)
         safe_set_full_fp32_param(lp, zero_tensor)  # updates the partitioned fp32 copy in place; returns None
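For readers skimming the diff: below is a minimal, self-contained sketch of how the two setters might be combined right after engine initialization. It is not part of the commit; the ``model`` and ``ds_config`` variables are placeholders, and the ``"exp_avg"``/``"exp_avg_sq"`` state keys assume an Adam-style optimizer.

    # Hypothetical sketch (not from the commit): zero the fp32 parameters and optimizer
    # states under ZeRO stage 1 or 2, after deepspeed.initialize().
    # `model` and `ds_config` are assumed to be defined elsewhere; the "exp_avg" and
    # "exp_avg_sq" keys assume an Adam-style optimizer.
    import torch
    import deepspeed
    from deepspeed.utils import safe_set_full_fp32_param, safe_set_full_optimizer_state

    model_engine, optimizer, _, _ = deepspeed.initialize(model=model,
                                                          model_parameters=model.parameters(),
                                                          config=ds_config)

    for n, lp in model_engine.module.named_parameters():
        # Stage 3 would first require a gather to assemble the partitioned parameter lp.
        zero_tensor = torch.zeros_like(lp)
        safe_set_full_fp32_param(lp, zero_tensor)
        safe_set_full_optimizer_state(lp, zero_tensor, "exp_avg")
        safe_set_full_optimizer_state(lp, zero_tensor, "exp_avg_sq")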
