[BE][FSDP] Change the logging level to info (#126362)
As title

Differential Revision: [D57419445](https://our.internmc.facebook.com/intern/diff/D57419445/)

Pull Request resolved: #126362
Approved by: https://github.com/awgu, https://github.com/Skylion007
fegin authored and ZelboK committed May 19, 2024
1 parent 272b119 commit 667af78
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion torch/distributed/fsdp/_debug_utils.py
@@ -57,7 +57,7 @@ def dump_and_reset(cls, msg: str) -> None:
         # This cannot be combined with DETAIL distributed log
         # as the profiling will be very incorrect.
         if dist.get_rank() == 0 and dist.get_debug_level() == dist.DebugLevel.INFO:
-            logger.warning("%s %s", msg, cls.results)
+            logger.info("%s %s", msg, cls.results)
         cls.reset()
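The practical effect of downgrading `logger.warning` to `logger.info`: with Python's default WARNING threshold, these profiling messages are now suppressed unless the user opts in to info-level logging. A minimal sketch using the stdlib `logging` module (the logger name and payload are illustrative, not from the PyTorch source):

```python
import logging

logging.basicConfig(level=logging.WARNING)  # Python's default effective level
logger = logging.getLogger("fsdp_demo")

results = {"all_gather": 0.12}

# At the WARNING threshold, info-level records are dropped:
logger.info("profiling results %s", results)     # not emitted
logger.warning("profiling results %s", results)  # emitted

# After this change, seeing the diagnostics requires enabling INFO:
logger.setLevel(logging.INFO)
logger.info("profiling results %s", results)     # now emitted
```

This keeps routine profiling output from masquerading as a warning in default configurations.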


2 changes: 1 addition & 1 deletion torch/distributed/fsdp/_optim_utils.py
@@ -1511,7 +1511,7 @@ def _allgather_orig_param_states(
     """
     fsdp_state = fsdp_param_info.state
     if fsdp_state.rank == 0 and dist.get_debug_level() == dist.DebugLevel.DETAIL:
-        logger.warning(
+        logger.info(
             "Memory Summary before calling to _allgather_orig_param_states %s",
             fsdp_state._device_handle.memory_summary(),
         )
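The guard in this hunk only logs on rank 0 and only when the distributed debug level is DETAIL. A hedged sketch of the same gating pattern using plain `logging`; `maybe_log_memory_summary` and its `rank`/`debug_level` parameters are hypothetical stand-ins for `dist.get_rank()` and `dist.get_debug_level()`, not PyTorch APIs:

```python
import logging

logger = logging.getLogger("fsdp_optim_demo")


def maybe_log_memory_summary(rank: int, debug_level: str, summary: str) -> bool:
    """Log the memory summary only on rank 0 and only at DETAIL debug level.

    Returns True when the message was logged, False when the gate
    suppressed it, so callers can verify the gating behavior.
    """
    if rank == 0 and debug_level == "DETAIL":
        logger.info(
            "Memory Summary before calling to _allgather_orig_param_states %s",
            summary,
        )
        return True
    return False
```

In PyTorch, the distributed debug level is typically controlled via the `TORCH_DISTRIBUTED_DEBUG` environment variable (`OFF`, `INFO`, or `DETAIL`), so this message stays silent unless the user explicitly requests detailed diagnostics.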
