3 changes: 3 additions & 0 deletions CHANGELOG.md
@@ -4,13 +4,16 @@ All notable changes to this project will be documented in this file.

The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/).


## [1.3.7] - 2021-06-22

- Fixed a bug where skipping an optimizer while using amp caused amp to trigger an assertion error ([#7975](https://github.com/PyTorchLightning/pytorch-lightning/pull/7975))
- Fixed deprecation messages not showing due to incorrect stacklevel ([#8002](https://github.com/PyTorchLightning/pytorch-lightning/pull/8002), [#8005](https://github.com/PyTorchLightning/pytorch-lightning/pull/8005))
- Fixed setting a `DistributedSampler` when using a distributed plugin in a custom accelerator ([#7814](https://github.com/PyTorchLightning/pytorch-lightning/pull/7814))
- Improved `PyTorchProfiler` Chrome trace names ([#8009](https://github.com/PyTorchLightning/pytorch-lightning/pull/8009))
- Fixed moving the best score to device in `EarlyStopping` callback for TPU devices ([#7959](https://github.com/PyTorchLightning/pytorch-lightning/pull/7959))
- Fixed backward compatibility of moved functions `rank_zero_warn` and `rank_zero_deprecation` ([#8085](https://github.com/PyTorchLightning/pytorch-lightning/pull/8085))


## [1.3.6] - 2021-06-15

2 changes: 1 addition & 1 deletion pytorch_lightning/__about__.py
@@ -1,7 +1,7 @@
import time

_this_year = time.strftime("%Y")
__version__ = '1.3.7'
__version__ = '1.3.7post0'
__author__ = 'William Falcon et al.'
__author_email__ = 'waf2107@columbia.edu'
__license__ = 'Apache-2.0'
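The bumped version string is a PEP 440 post-release: `1.3.7post0` normalizes to `1.3.7.post0` and sorts after `1.3.7` while still belonging to the 1.3.7 release. A minimal sketch with the third-party `packaging` library (not part of this PR) illustrates the normalization and ordering:

```python
# Sketch only: checks how PEP 440 normalizes and orders the post-release
# string used above; assumes `packaging` is installed separately.
from packaging.version import Version

post = Version("1.3.7post0")           # normalizes to "1.3.7.post0"
assert str(post) == "1.3.7.post0"
assert Version("1.3.7") < post < Version("1.3.8")
print(post.release, post.post)         # (1, 3, 7) 0
```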
18 changes: 18 additions & 0 deletions pytorch_lightning/utilities/distributed.py
@@ -65,6 +65,24 @@ def _get_rank() -> int:
rank_zero_only.rank = getattr(rank_zero_only, 'rank', _get_rank())


def rank_zero_warn(*args, stacklevel: int = 5, **kwargs):
    from pytorch_lightning.utilities.warnings import rank_zero_deprecation, rank_zero_warn
    rank_zero_deprecation(
        '`pytorch_lightning.utilities.distributed.rank_zero_warn` has been moved to'
        ' `pytorch_lightning.utilities.rank_zero_warn` in v1.3.7 and will be removed in v1.6'
    )
    return rank_zero_warn(*args, stacklevel=stacklevel, **kwargs)


def rank_zero_deprecation(*args, stacklevel: int = 5, **kwargs):
    from pytorch_lightning.utilities.warnings import rank_zero_deprecation
    rank_zero_deprecation(
        '`pytorch_lightning.utilities.distributed.rank_zero_deprecation` has been moved to'
        ' `pytorch_lightning.utilities.rank_zero_deprecation` in v1.3.7 and will be removed in v1.6'
    )
    return rank_zero_deprecation(*args, stacklevel=stacklevel, **kwargs)


def _info(*args, stacklevel: int = 2, **kwargs):
    if python_version() >= "3.8.0":
        kwargs['stacklevel'] = stacklevel
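For context, a minimal usage sketch of the shims above from a caller's point of view, assuming a single-process (rank 0) run; the warning message string here is illustrative only:

```python
# Sketch only: the old import path still resolves, but each call first emits
# a deprecation message before delegating to the implementation that now
# lives in pytorch_lightning.utilities.warnings.
import warnings

from pytorch_lightning.utilities.distributed import rank_zero_warn  # deprecated location

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    rank_zero_warn("num_workers=0 may be a bottleneck")  # illustrative message

# On rank 0 this records both the deprecation notice and the original warning.
for w in caught:
    print(w.category.__name__, str(w.message))
```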