
Create barrier without timeout in prepare_data() #19448

Merged
merged 23 commits into master from feature/infinite-barrier on Feb 13, 2024

Conversation

@awaelchli (Member) commented on Feb 12, 2024

What does this PR do?

Fixes #19266

LightningModule and LightningDataModule have a prepare_data() hook for running preprocessing and data downloads in multiprocessing/multi-GPU settings: the hook runs only on local rank 0, to avoid race conditions and duplicated work. A long-standing issue has been that this hook is subject to the collective timeout of the world process group (30 minutes by default). If your processing code takes longer than 30 minutes to complete, you cannot use the prepare_data mechanism. The Fabric equivalent, the Fabric.rank_zero_first() context manager, has the same problem.
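For context, a minimal sketch of how the hook is typically used; the helper functions and paths are placeholders for illustration, not part of this PR:

import lightning as L


class MyDataModule(L.LightningDataModule):
    def prepare_data(self):
        # Called on local rank 0 only, so the download/preprocessing
        # is not duplicated by the other processes on the node.
        download_and_preprocess("/data/my-dataset")  # hypothetical helper

    def setup(self, stage):
        # Called on every process; load the already-prepared data here.
        self.dataset = load_preprocessed("/data/my-dataset")  # hypothetical helper

The Fabric equivalent guards the same kind of rank-zero work with `with fabric.rank_zero_first(): ...`.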

This PR introduces an "infinite" barrier that will not time out and is used exclusively around the prepare_data() hook (and rank_zero_first() in Fabric).

What the Trainer did before:

if trainer.local_rank == 0:
    datamodule.prepare_data()
barrier()  # the normal barrier has a 30 min timeout

What it does now:

with _InfiniteBarrier():
    if trainer.local_rank == 0:
        datamodule.prepare_data()
# < ------- barrier at end of this context without timeout
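For reference, a simplified sketch of the idea (the actual _InfiniteBarrier in this PR differs in details): open a dedicated gloo process group with an effectively unbounded timeout and synchronize on it when the context exits, so the default 30-minute collective timeout never applies.

from datetime import timedelta

import torch.distributed as dist


class InfiniteBarrierSketch:
    """Context manager that barriers on exit without the default collective timeout."""

    def __enter__(self):
        self.group = None
        if dist.is_available() and dist.is_initialized():
            # A separate gloo group so the barrier works regardless of the
            # main backend (e.g. NCCL), with a timeout so large it is
            # effectively infinite.
            self.group = dist.new_group(backend="gloo", timeout=timedelta(days=10000))
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        if self.group is not None:
            dist.barrier(group=self.group)  # all ranks meet here, no 30-min limit
            dist.destroy_process_group(self.group)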

I have verified this works in multi-node jobs with Lightning Studio by taking a standard Trainer example and implementing prepare_data() with a 40-minute sleep on rank 0. On the main branch, the jobs time out and fail after ~30 minutes:

[E ProcessGroupNCCL.cpp:475] [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=3, OpType=ALLREDUCE, NumelIn=1, NumelOut=1, Timeout(ms)=1800000) ran for 1800420 milliseconds before timing out.
[E ProcessGroupNCCL.cpp:489] Some NCCL operations have failed or timed out. Due to the asynchronous nature of CUDA kernels, subsequent GPU operations might run on corrupted/incomplete data.
[E ProcessGroupNCCL.cpp:495] To avoid data inconsistency, we are taking the entire process down.
[E ProcessGroupNCCL.cpp:916] [Rank 1] NCCL watchdog thread terminated with exception: [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=3, OpType=ALLREDUCE, NumelIn=1, NumelOut=1, Timeout(ms)=1800000) ran for 1800420 milliseconds before timing out.
terminate called after throwing an instance of 'std::runtime_error'
  what():  [Rank 1] NCCL watchdog thread terminated with exception: [Rank 1] Watchdog caught collective operation timeout: WorkNCCL(SeqNum=3, OpType=ALLREDUCE, NumelIn=1, NumelOut=1, Timeout(ms)=1800000) ran for 1800420 milliseconds before timing out.
Aborted (core dumped)

Whereas with the implementation in this branch, the 40-minute sleep finishes, the processes meet at the barrier, and training starts:

[Screenshot: the jobs proceed past prepare_data() and training starts without a timeout]
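For completeness, a sketch of the kind of repro used for this verification; the Trainer arguments and the dataloader are illustrative assumptions, and BoringModel comes from Lightning's demo classes:

import time

import lightning as L
from torch.utils.data import DataLoader
from lightning.pytorch.demos.boring_classes import BoringModel, RandomDataset


class SlowPrepareDataModule(L.LightningDataModule):
    def prepare_data(self):
        # Runs on local rank 0 only; simulate preprocessing that exceeds
        # the default 30-minute collective timeout.
        time.sleep(40 * 60)

    def train_dataloader(self):
        return DataLoader(RandomDataset(32, 64), batch_size=2)


if __name__ == "__main__":
    trainer = L.Trainer(accelerator="gpu", devices=2, num_nodes=2, max_epochs=1)
    trainer.fit(BoringModel(), datamodule=SlowPrepareDataModule())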

📚 Documentation preview 📚: https://pytorch-lightning--19448.org.readthedocs.build/en/19448/

cc @Borda @awaelchli @carmocca @justusschock

@awaelchli added the fun (Staff contributions outside working hours, to differentiate from the "community" label) and feature (Is an improvement or enhancement) labels on Feb 12, 2024
@awaelchli added this to the 2.3 milestone on Feb 12, 2024
@github-actions bot added the fabric (lightning.fabric.Fabric) and pl (Generic label for PyTorch Lightning package) labels on Feb 12, 2024
@awaelchli added the distributed (Generic distributed-related topic) label and removed the fabric and pl labels on Feb 12, 2024
@github-actions bot re-added the fabric and pl labels on Feb 12, 2024
@awaelchli marked this pull request as ready for review on February 12, 2024 at 13:43
@github-actions bot (Contributor) commented on Feb 12, 2024

⚡ Required checks status: All passing 🟢

Groups summary

🟢 pytorch_lightning: Tests workflow
Check ID Status
pl-cpu (macOS-11, lightning, 3.8, 1.13, oldest) success
pl-cpu (macOS-11, lightning, 3.10, 1.13) success
pl-cpu (macOS-11, lightning, 3.10, 2.1) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.13, oldest) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 1.13) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 2.1) success
pl-cpu (windows-2022, lightning, 3.8, 1.13, oldest) success
pl-cpu (windows-2022, lightning, 3.10, 1.13) success
pl-cpu (windows-2022, lightning, 3.10, 2.1) success
pl-cpu (macOS-11, pytorch, 3.8, 1.13) success
pl-cpu (ubuntu-20.04, pytorch, 3.8, 1.13) success
pl-cpu (windows-2022, pytorch, 3.8, 1.13) success
pl-cpu (macOS-12, pytorch, 3.11, 2.0) success
pl-cpu (macOS-12, pytorch, 3.11, 2.1) success
pl-cpu (ubuntu-22.04, pytorch, 3.11, 2.0) success
pl-cpu (ubuntu-22.04, pytorch, 3.11, 2.1) success
pl-cpu (windows-2022, pytorch, 3.11, 2.0) success
pl-cpu (windows-2022, pytorch, 3.11, 2.1) success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, src/lightning/pytorch/trainer/connectors/data_connector.py, tests/tests_pytorch/core/test_datamodules.py, tests/tests_pytorch/models/test_hooks.py.

🟢 pytorch_lightning: Azure GPU
Check ID Status
pytorch-lightning (GPUs) (testing Lightning | latest) success
pytorch-lightning (GPUs) (testing PyTorch | latest) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py, tests/tests_pytorch/core/test_datamodules.py, tests/tests_pytorch/models/test_hooks.py, src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py.

🟢 pytorch_lightning: Benchmarks
Check ID Status
lightning.Benchmarks success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 fabric: Docs
Check ID Status
docs-make (fabric, doctest) success
docs-make (fabric, html) success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py.

🟢 pytorch_lightning: Docs
Check ID Status
docs-make (pytorch, doctest) success
docs-make (pytorch, html) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 lightning_fabric: CPU workflow
Check ID Status
fabric-cpu (macOS-11, lightning, 3.8, 1.13, oldest) success
fabric-cpu (macOS-11, lightning, 3.10, 1.13) success
fabric-cpu (macOS-11, lightning, 3.11, 2.1) success
fabric-cpu (ubuntu-20.04, lightning, 3.8, 1.13, oldest) success
fabric-cpu (ubuntu-20.04, lightning, 3.10, 1.13) success
fabric-cpu (ubuntu-20.04, lightning, 3.11, 2.1) success
fabric-cpu (windows-2022, lightning, 3.8, 1.13, oldest) success
fabric-cpu (windows-2022, lightning, 3.10, 1.13) success
fabric-cpu (windows-2022, lightning, 3.11, 2.1) success
fabric-cpu (macOS-11, fabric, 3.8, 1.13) success
fabric-cpu (ubuntu-20.04, fabric, 3.8, 1.13) success
fabric-cpu (windows-2022, fabric, 3.8, 1.13) success
fabric-cpu (macOS-12, fabric, 3.11, 2.0) success
fabric-cpu (macOS-12, fabric, 3.11, 2.1) success
fabric-cpu (ubuntu-22.04, fabric, 3.11, 2.0) success
fabric-cpu (ubuntu-22.04, fabric, 3.11, 2.1) success
fabric-cpu (windows-2022, fabric, 3.11, 2.0) success
fabric-cpu (windows-2022, fabric, 3.11, 2.1) success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, tests/tests_fabric/test_fabric.py, tests/tests_fabric/utilities/test_distributed.py.

🟢 lightning_fabric: Azure GPU
Check ID Status
lightning-fabric (GPUs) (testing Fabric | latest) success
lightning-fabric (GPUs) (testing Lightning | latest) success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, tests/tests_fabric/test_fabric.py, tests/tests_fabric/utilities/test_distributed.py.

🟢 mypy
Check ID Status
mypy success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 install
Check ID Status
install-pkg (ubuntu-22.04, app, 3.8) success
install-pkg (ubuntu-22.04, app, 3.11) success
install-pkg (ubuntu-22.04, fabric, 3.8) success
install-pkg (ubuntu-22.04, fabric, 3.11) success
install-pkg (ubuntu-22.04, pytorch, 3.8) success
install-pkg (ubuntu-22.04, pytorch, 3.11) success
install-pkg (ubuntu-22.04, lightning, 3.8) success
install-pkg (ubuntu-22.04, lightning, 3.11) success
install-pkg (ubuntu-22.04, notset, 3.8) success
install-pkg (ubuntu-22.04, notset, 3.11) success
install-pkg (macOS-12, app, 3.8) success
install-pkg (macOS-12, app, 3.11) success
install-pkg (macOS-12, fabric, 3.8) success
install-pkg (macOS-12, fabric, 3.11) success
install-pkg (macOS-12, pytorch, 3.8) success
install-pkg (macOS-12, pytorch, 3.11) success
install-pkg (macOS-12, lightning, 3.8) success
install-pkg (macOS-12, lightning, 3.11) success
install-pkg (macOS-12, notset, 3.8) success
install-pkg (macOS-12, notset, 3.11) success
install-pkg (windows-2022, app, 3.8) success
install-pkg (windows-2022, app, 3.11) success
install-pkg (windows-2022, fabric, 3.8) success
install-pkg (windows-2022, fabric, 3.11) success
install-pkg (windows-2022, pytorch, 3.8) success
install-pkg (windows-2022, pytorch, 3.11) success
install-pkg (windows-2022, lightning, 3.8) success
install-pkg (windows-2022, lightning, 3.11) success
install-pkg (windows-2022, notset, 3.8) success
install-pkg (windows-2022, notset, 3.11) success

These checks are required after the changes to src/lightning/fabric/fabric.py, src/lightning/fabric/utilities/distributed.py, src/lightning/pytorch/trainer/connectors/data_connector.py.


Thank you for your contribution! 💜

Note
This comment is automatically generated and updates for 60 minutes every 180 seconds. If you have any other questions, contact carmocca for help.

@codecov bot commented on Feb 12, 2024

Codecov Report

Merging #19448 (e27d7f1) into master (2ed7282) will decrease coverage by 30%.
Report is 4 commits behind head on master.
The diff coverage is 100%.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #19448      +/-   ##
==========================================
- Coverage      83%      53%     -30%     
==========================================
  Files         452      446       -6     
  Lines       38136    37967     -169     
==========================================
- Hits        31784    20268   -11516     
- Misses       6352    17699   +11347     

@mergify bot added the ready (PRs ready to be merged) label on Feb 13, 2024
@carmocca merged commit 3c5a465 into master on Feb 13, 2024
113 checks passed
@carmocca deleted the feature/infinite-barrier branch on February 13, 2024 at 11:10
jojje added a commit to jojje/pytorch-lightning that referenced this pull request Mar 5, 2024
Labels
distributed (Generic distributed-related topic), fabric (lightning.fabric.Fabric), feature (Is an improvement or enhancement), fun (Staff contributions outside working hours, to differentiate from the "community" label), pl (Generic label for PyTorch Lightning package), ready (PRs ready to be merged)
Development

Successfully merging this pull request may close these issues.

Timeout while waiting for prepare_data to finish
4 participants