
Meta device initialization for FSDP in Trainer #18385

Merged · 16 commits · Aug 25, 2023

Conversation
@awaelchli (Member) commented Aug 24, 2023

What does this PR do?

Same approach as in #18122

trainer = Trainer(strategy="fsdp")

with trainer.init_module(empty_init=True):
    # the model is now on the meta device (no memory occupied)
    model = Model()

...

# materialization, parameter initialization, and sharding happen in `strategy.setup`
trainer.fit(model)

This allows you to instantiate very large models that wouldn't otherwise fit in memory (CPU or GPU) as fast as possible. No memory is allocated for the weights, neither on CPU nor GPU; parameters are materialized and initialized with random weights only at the time the model gets wrapped and sharded in FSDPStrategy.setup().
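As a rough sketch of what happens under the hood (plain PyTorch, not the Lightning API): the model is built under the meta device so only shapes and dtypes are recorded, and FSDP later allocates real storage and re-runs the init logic.

```python
import torch
import torch.nn as nn

# Build the model on the meta device: parameter shapes and dtypes are
# recorded, but no storage is allocated for the weights.
with torch.device("meta"):
    model = nn.Linear(4096, 4096)

assert model.weight.is_meta  # nothing occupies CPU or GPU memory yet

# Conceptually what FSDPStrategy.setup() does at wrapping time:
# allocate empty storage on the real device, then call reset_parameters()
# to initialize the weights in place.
model = model.to_empty(device="cpu")
model.reset_parameters()
assert not model.weight.is_meta
```

This is why the `reset_parameters()` contract below matters: `to_empty()` only allocates uninitialized storage, and the subsequent `reset_parameters()` call is what produces sensible random weights.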

Notes:

  • This new feature is possible thanks to [RFC] Revisiting Meta Device Initialization with reset_parameters() pytorch/pytorch#104187 in PyTorch 2.1 nightly.
    Requirement: your submodules must define a reset_parameters() method that can be called to initialize the parameters. This is the case for most built-in PyTorch layers. If you have a custom layer, you'd have to add that method yourself. This PR also updates our Transformer example, which needed its reset_parameters() fixed accordingly.
  • Documentation will be updated in a follow-up, both for Fabric and Trainer.
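To illustrate the reset_parameters() requirement, here is a minimal sketch of a custom layer (ScaledLinear is a made-up name, not part of this PR) that satisfies the contract: all initialization logic lives in reset_parameters() so it can be re-invoked after materialization from the meta device.

```python
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class ScaledLinear(nn.Module):
    """Hypothetical custom layer implementing the reset_parameters() contract."""

    def __init__(self, in_features: int, out_features: int) -> None:
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.scale = nn.Parameter(torch.empty(1))
        self.reset_parameters()

    def reset_parameters(self) -> None:
        # Safe to call repeatedly; also invoked after to_empty() when the
        # module was originally constructed on the meta device.
        nn.init.kaiming_uniform_(self.weight, a=math.sqrt(5))
        nn.init.ones_(self.scale)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * F.linear(x, self.weight)


# Construct on the meta device, then materialize and initialize.
with torch.device("meta"):
    layer = ScaledLinear(16, 4)  # no memory allocated yet
layer = layer.to_empty(device="cpu")
layer.reset_parameters()
```

A layer without reset_parameters() would come out of to_empty() with uninitialized memory, which is exactly the failure mode this requirement prevents.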

cc @Borda @awaelchli @carmocca

@awaelchli awaelchli added this to the 2.1 milestone Aug 24, 2023
@awaelchli awaelchli added the feature, strategy: fsdp, and pl labels Aug 24, 2023
@github-actions github-actions bot added the pl label Aug 24, 2023
@awaelchli awaelchli marked this pull request as ready for review August 24, 2023 23:48
github-actions bot commented Aug 24, 2023

⚡ Required checks status: All passing 🟢

Groups summary

🟢 pytorch_lightning: Tests workflow
Check ID Status
pl-cpu (macOS-11, lightning, 3.8, 1.11) success
pl-cpu (macOS-11, lightning, 3.9, 1.12) success
pl-cpu (macOS-11, lightning, 3.10, 1.13) success
pl-cpu (macOS-11, lightning, 3.10, 2.0) success
pl-cpu (macOS-11, lightning, 3.8, 1.11, oldest) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11) success
pl-cpu (ubuntu-20.04, lightning, 3.9, 1.12) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 1.13) success
pl-cpu (ubuntu-20.04, lightning, 3.10, 2.0) success
pl-cpu (ubuntu-20.04, lightning, 3.8, 1.11, oldest) success
pl-cpu (windows-2022, lightning, 3.8, 1.11) success
pl-cpu (windows-2022, lightning, 3.9, 1.12) success
pl-cpu (windows-2022, lightning, 3.10, 1.13) success
pl-cpu (windows-2022, lightning, 3.10, 2.0) success
pl-cpu (windows-2022, lightning, 3.8, 1.11, oldest) success
pl-cpu (macOS-11, pytorch, 3.8, 1.13) success
pl-cpu (ubuntu-20.04, pytorch, 3.8, 1.13) success
pl-cpu (windows-2022, pytorch, 3.8, 1.13) success

These checks are required after the changes to src/lightning/pytorch/demos/transformer.py, src/lightning/pytorch/strategies/fsdp.py, tests/tests_pytorch/strategies/test_fsdp.py.

🟢 pytorch_lightning: Azure GPU
Check ID Status
[pytorch-lightning (GPUs) (testing Lightning latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171443&view=logs&jobId=47e66f3c-897a-5428-da11-bf5c7745762e) success
[pytorch-lightning (GPUs) (testing PyTorch latest)](https://dev.azure.com/Lightning-AI/72ab7ed8-b00f-4b6e-b131-3388f7ffafa7/_build/results?buildId=171443&view=logs&jobId=3f274fac-2e11-54ca-487e-194c91f3ae9f) success


🟢 pytorch_lightning: Benchmarks
Check ID Status
lightning.Benchmarks success

These checks are required after the changes to src/lightning/pytorch/demos/transformer.py, src/lightning/pytorch/strategies/fsdp.py.

🟢 pytorch_lightning: Docs
Check ID Status
docs-checks (pytorch, doctest) success
make-html (pytorch) success


🟢 mypy
Check ID Status
mypy success


🟢 install
Check ID Status
install-pkg (ubuntu-22.04, app, 3.8) success
install-pkg (ubuntu-22.04, app, 3.10) success
install-pkg (ubuntu-22.04, fabric, 3.8) success
install-pkg (ubuntu-22.04, fabric, 3.10) success
install-pkg (ubuntu-22.04, pytorch, 3.8) success
install-pkg (ubuntu-22.04, pytorch, 3.10) success
install-pkg (ubuntu-22.04, lightning, 3.8) success
install-pkg (ubuntu-22.04, lightning, 3.10) success
install-pkg (ubuntu-22.04, notset, 3.8) success
install-pkg (ubuntu-22.04, notset, 3.10) success
install-pkg (macOS-12, app, 3.8) success
install-pkg (macOS-12, app, 3.10) success
install-pkg (macOS-12, fabric, 3.8) success
install-pkg (macOS-12, fabric, 3.10) success
install-pkg (macOS-12, pytorch, 3.8) success
install-pkg (macOS-12, pytorch, 3.10) success
install-pkg (macOS-12, lightning, 3.8) success
install-pkg (macOS-12, lightning, 3.10) success
install-pkg (macOS-12, notset, 3.8) success
install-pkg (macOS-12, notset, 3.10) success
install-pkg (windows-2022, app, 3.8) success
install-pkg (windows-2022, app, 3.10) success
install-pkg (windows-2022, fabric, 3.8) success
install-pkg (windows-2022, fabric, 3.10) success
install-pkg (windows-2022, pytorch, 3.8) success
install-pkg (windows-2022, pytorch, 3.10) success
install-pkg (windows-2022, lightning, 3.8) success
install-pkg (windows-2022, lightning, 3.10) success
install-pkg (windows-2022, notset, 3.8) success
install-pkg (windows-2022, notset, 3.10) success



Thank you for your contribution! 💜

Note
This comment is automatically generated and updates for 60 minutes every 180 seconds. If you have any other questions, contact carmocca for help.

@mergify mergify bot added the ready PRs ready to be merged label Aug 25, 2023
Co-authored-by: Jirka Borovec <6035284+Borda@users.noreply.github.com>
@awaelchli awaelchli merged commit 3d41313 into master Aug 25, 2023
82 checks passed
@awaelchli awaelchli deleted the feature/fsdp-meta branch August 25, 2023 18:18
Labels
  • feature: Is an improvement or enhancement
  • pl: Generic label for PyTorch Lightning package
  • ready: PRs ready to be merged
  • strategy: fsdp: Fully Sharded Data Parallel
Projects
None yet
Development

Successfully merging this pull request may close these issues.

None yet

3 participants