
Fix overlapping samples in DDP when no global seed is set #17713

Merged — 8 commits merged into master from bugfix/dist-sampler-shuffle-true on May 31, 2023

Conversation

awaelchli
Member

@awaelchli awaelchli commented May 28, 2023

What does this PR do?

This fixes an edge case that occurs when the user runs with `shuffle=True`, no global seed set, and `strategy="ddp"`. On master, this combination results in overlapping samples across the DDP ranks. Reported by user Turab Iqbal on Slack (thanks!).

Repro:

```python
import lightning as pl
import torch


class MyModule(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(1, 1)

    def configure_optimizers(self):
        return torch.optim.SGD(self.parameters(), lr=0.1)

    def training_step(self, batch, _):
        print(f'value: {batch.item()}')
        # return a loss so the optimizer step actually runs
        return self.layer(batch.float().view(1, 1)).sum()


loader = torch.utils.data.DataLoader(range(16), shuffle=True)
trainer = pl.Trainer(
    accelerator='gpu',
    devices=2,
    strategy='ddp_find_unused_parameters_false',
    enable_progress_bar=False,
    max_epochs=1,
    logger=False,
)
trainer.fit(MyModule(), loader)
```
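Until the fix lands, a user-side workaround is to set a global seed before constructing the Trainer, so every rank derives the same shuffle permutation. Below is a minimal sketch of what such a helper does; the function name is hypothetical, and Lightning's real `seed_everything` additionally seeds NumPy and data-loader workers.

```python
import os
import random

import torch


def seed_everything_sketch(seed: int) -> None:
    # Hypothetical minimal stand-in for a global-seed helper: seed the
    # Python and PyTorch RNGs and export the seed so child DDP processes
    # can pick it up from the environment.
    random.seed(seed)
    torch.manual_seed(seed)
    os.environ["PL_GLOBAL_SEED"] = str(seed)


seed_everything_sketch(42)
perm1 = torch.randperm(16).tolist()

seed_everything_sketch(42)
perm2 = torch.randperm(16).tolist()

# Any two "ranks" seeded identically produce the same shuffle order.
assert perm1 == perm2
```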

Note:

  • This is only a problem with DDP proper. DDP-spawn and fork don't have this issue since they copy the program state and start with the same initial state on all ranks.
  • This is not an issue for Fabric, and the changes here align the logic for both implementations.
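For context (not part of the PR diff), the overlap can be reproduced with `torch.utils.data.DistributedSampler` directly: each rank partitions its own shuffled permutation, so when ranks end up with different seeds the per-rank halves no longer come from the same permutation and can overlap. A sketch, with the seed values chosen arbitrarily:

```python
import torch
from torch.utils.data.distributed import DistributedSampler

dataset = list(range(16))


def rank_indices(rank: int, seed: int) -> set:
    # Each simulated rank builds its sampler with its own seed, as happens
    # when no global seed is set and every DDP process seeds itself.
    sampler = DistributedSampler(
        dataset, num_replicas=2, rank=rank, shuffle=True, seed=seed
    )
    return set(sampler)


# Different per-rank seeds (arbitrary values): the two halves may overlap.
r0, r1 = rank_indices(0, seed=123), rank_indices(1, seed=456)
print("overlap with per-rank seeds:", sorted(r0 & r1))

# A shared seed makes the ranks split the *same* permutation:
# disjoint halves that together cover the whole dataset.
s0, s1 = rank_indices(0, seed=42), rank_indices(1, seed=42)
print("overlap with shared seed:", sorted(s0 & s1))
print("union covers dataset:", s0 | s1 == set(range(16)))
```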

cc @Borda @justusschock @awaelchli

@github-actions github-actions bot added the pl Generic label for PyTorch Lightning package label May 28, 2023
@awaelchli awaelchli added bug Something isn't working data handling Generic data-related topic strategy: ddp DistributedDataParallel labels May 28, 2023
@awaelchli awaelchli modified the milestones: 2.0.x, v1.9.x May 28, 2023
@awaelchli awaelchli changed the title Fix overlapping samples in DDP when no global seed is set WIP: Fix overlapping samples in DDP when no global seed is set May 28, 2023
@awaelchli awaelchli marked this pull request as ready for review May 28, 2023 22:10
@github-actions
Contributor

github-actions bot commented May 28, 2023

⚡ Required checks status: All passing 🟢

Groups summary

🟢 pytorch_lightning: Tests workflow
Check ID Status
pl-cpu (macOS-11, lightning, 3.8, 1.11) success
pl-cpu (macOS-11, lightning, 3.9, 1.12) success
pl-cpu (macOS-11, lightning, 3.10, 1.13) success
pl-cpu (macOS-11, lightning, 3.10, 2.0) success
pl-cpu (macOS-11, lightning, 3.8, 1.11, oldest) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11) success
pl-cpu (ubuntu-20.04, lightning, 3.9, 1.12) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 1.13) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 2.0) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11, oldest) success
pl-cpu (windows-2022, lightning, 3.8, 1.11) success
pl-cpu (windows-2022, lightning, 3.9, 1.12) success
pl-cpu (windows-2022, lightning, 3.10, 1.13) success
pl-cpu (windows-2022, lightning, 3.10, 2.0) success
pl-cpu (windows-2022, lightning, 3.8, 1.11, oldest) success
pl-cpu (macOS-11, pytorch, 3.8, 1.13) success
pl-cpu (ubuntu-20.04, pytorch, 3.8, 1.13) success
pl-cpu (windows-2022, pytorch, 3.8, 1.13) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py, tests/tests_pytorch/trainer/test_dataloaders.py.

🟢 pytorch_lightning: Azure GPU
Check ID Status
pytorch-lightning (GPUs) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py, tests/tests_pytorch/trainer/test_dataloaders.py.

🟢 pytorch_lightning: Benchmarks
Check ID Status
lightning.Benchmarks success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 pytorch_lightning: Docs
Check ID Status
make-doctest (pytorch) success
make-html (pytorch) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 mypy
Check ID Status
mypy success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 install
Check ID Status
install-pkg (ubuntu-22.04, app, 3.8) success
install-pkg (ubuntu-22.04, app, 3.10) success
install-pkg (ubuntu-22.04, fabric, 3.8) success
install-pkg (ubuntu-22.04, fabric, 3.10) success
install-pkg (ubuntu-22.04, pytorch, 3.8) success
install-pkg (ubuntu-22.04, pytorch, 3.10) success
install-pkg (ubuntu-22.04, lightning, 3.8) success
install-pkg (ubuntu-22.04, lightning, 3.10) success
install-pkg (ubuntu-22.04, notset, 3.8) success
install-pkg (ubuntu-22.04, notset, 3.10) success
install-pkg (macOS-12, app, 3.8) success
install-pkg (macOS-12, app, 3.10) success
install-pkg (macOS-12, fabric, 3.8) success
install-pkg (macOS-12, fabric, 3.10) success
install-pkg (macOS-12, pytorch, 3.8) success
install-pkg (macOS-12, pytorch, 3.10) success
install-pkg (macOS-12, lightning, 3.8) success
install-pkg (macOS-12, lightning, 3.10) success
install-pkg (macOS-12, notset, 3.8) success
install-pkg (macOS-12, notset, 3.10) success
install-pkg (windows-2022, app, 3.8) success
install-pkg (windows-2022, app, 3.10) success
install-pkg (windows-2022, fabric, 3.8) success
install-pkg (windows-2022, fabric, 3.10) success
install-pkg (windows-2022, pytorch, 3.8) success
install-pkg (windows-2022, pytorch, 3.10) success
install-pkg (windows-2022, lightning, 3.8) success
install-pkg (windows-2022, lightning, 3.10) success
install-pkg (windows-2022, notset, 3.8) success
install-pkg (windows-2022, notset, 3.10) success

These checks are required after the changes to src/lightning/pytorch/trainer/connectors/data_connector.py.

🟢 link-check
Check ID Status
check-md-links / markdown-link-check success

These checks are required after the changes to src/lightning/pytorch/CHANGELOG.md.


Thank you for your contribution! 💜

Note
This comment is automatically generated and updates every 180 seconds for 60 minutes. If you have any other questions, contact carmocca for help.

@awaelchli awaelchli changed the title WIP: Fix overlapping samples in DDP when no global seed is set Fix overlapping samples in DDP when no global seed is set May 28, 2023
@Borda Borda enabled auto-merge (squash) May 29, 2023 12:14
@mergify mergify bot added the ready PRs ready to be merged label May 29, 2023
@github-actions github-actions bot added the fabric lightning.fabric.Fabric label May 29, 2023
@awaelchli awaelchli force-pushed the bugfix/dist-sampler-shuffle-true branch 2 times, most recently from e8be589 to 64763e8 Compare May 29, 2023 20:24
@github-actions github-actions bot removed the fabric lightning.fabric.Fabric label May 29, 2023
@Borda Borda merged commit 53815e6 into master May 31, 2023
79 checks passed
@Borda Borda deleted the bugfix/dist-sampler-shuffle-true branch May 31, 2023 14:55
Borda pushed a commit that referenced this pull request Jun 2, 2023
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

(cherry picked from commit 53815e6)
lantiga pushed a commit that referenced this pull request Jun 2, 2023
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>

(cherry picked from commit 53815e6)
4 participants