
Bump pytorch-lightning from 1.6.5 to 1.7.0 in /requirements #697

Merged

Conversation

dependabot[bot]
Contributor

@dependabot dependabot bot commented on behalf of github Aug 2, 2022

Bumps pytorch-lightning from 1.6.5 to 1.7.0.

Release notes

Sourced from pytorch-lightning's releases.

PyTorch Lightning 1.7: Apple Silicon support, Native FSDP, Collaborative training, and multi-GPU support with Jupyter notebooks

The core team is excited to announce the release of PyTorch Lightning 1.7 ⚡

PyTorch Lightning 1.7 is the culmination of work from 106 contributors who contributed features, bug fixes, and documentation across more than 492 commits since 1.6.0.

Highlights

Apple Silicon Support

For those using PyTorch 1.12 on M1 or M2 Apple machines, we have created the MPSAccelerator. MPSAccelerator enables accelerated GPU training using Apple’s Metal Performance Shaders (MPS) as the backend.


NOTE

Support for this accelerator is currently marked as experimental in PyTorch. Because many operators are still missing, you may run into a few rough edges.


import pytorch_lightning as pl

# Selects the accelerator
trainer = pl.Trainer(accelerator="mps")

# Equivalent to
from pytorch_lightning.accelerators import MPSAccelerator
trainer = pl.Trainer(accelerator=MPSAccelerator())

# Defaults to "mps" when run on M1 or M2 Apple machines
# to avoid code changes when switching computers
trainer = pl.Trainer(accelerator="gpu")

Native Fully Sharded Data Parallel Strategy

PyTorch 1.12 also added native support for Fully Sharded Data Parallel (FSDP). Previously, PyTorch Lightning enabled this by using the fairscale project. You can now choose between the two.


NOTE

Support for this strategy is marked as beta in PyTorch.
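
For illustration, selecting the native implementation should look roughly like the sketch below (based on the 1.7 API: "fsdp_native" and DDPFullyShardedNativeStrategy refer to the new native strategy, while "fsdp" continues to select the fairscale-based one; the device count is arbitrary):

import pytorch_lightning as pl

# Native (PyTorch) FSDP via the registered strategy name
trainer = pl.Trainer(accelerator="gpu", devices=4, strategy="fsdp_native")

# Equivalent, constructing the strategy object directly
from pytorch_lightning.strategies import DDPFullyShardedNativeStrategy
trainer = pl.Trainer(accelerator="gpu", devices=4, strategy=DDPFullyShardedNativeStrategy())

# The existing fairscale-based strategy remains available as "fsdp"
trainer = pl.Trainer(accelerator="gpu", devices=4, strategy="fsdp")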



... (truncated)

Commits

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [pytorch-lightning](https://github.com/Lightning-AI/lightning) from 1.6.5 to 1.7.0.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](Lightning-AI/pytorch-lightning@1.6.5...pl/1.7.0)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
@dependabot dependabot bot added the dependencies Packaging and dependencies label Aug 2, 2022
@github-actions github-actions bot added the trainers PyTorch Lightning trainers label Aug 2, 2022
@adamjstewart adamjstewart added this to the 0.3.1 milestone Aug 2, 2022
@adamjstewart
Collaborator

We should avoid direct imports like:

from pytorch_lightning.core.lightning import LightningModule

This path isn't documented anywhere, and the module was renamed to pytorch_lightning.core.module.LightningModule; the old path is deprecated in 1.7 and removed in 1.9. Instead, we should use:

import pytorch_lightning as pl
...
pl.LightningModule

This should work for every version.
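
For illustration, a minimal module written against the public namespace might look like this (DummyTask and its single linear layer are hypothetical, purely to show the import pattern):

import torch
import pytorch_lightning as pl

class DummyTask(pl.LightningModule):  # no deep import path needed
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(1, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.mse_loss(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters())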

When creating the 0.3.1 release, it should be noted that this PR adds support for pytorch_lightning 1.9+.

@github-actions github-actions bot added the datamodules PyTorch Lightning datamodules label Aug 2, 2022
Comment on lines -92 to -93
progress_bar_refresh_rate=0,
checkpoint_callback=False,
Collaborator


Both of these parameters were deprecated in v1.5 and removed in v1.7.
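
For reference, the boolean flags that replaced them should cover the same intent (a minimal sketch, assuming the goal here is simply to disable the progress bar and checkpointing):

import pytorch_lightning as pl

trainer = pl.Trainer(
    enable_progress_bar=False,   # replaces progress_bar_refresh_rate=0
    enable_checkpointing=False,  # replaces checkpoint_callback=False
)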

@github-actions github-actions bot added the documentation Improvements or additions to documentation label Aug 2, 2022
@adamjstewart adamjstewart merged commit 044d901 into main Aug 2, 2022
@adamjstewart adamjstewart deleted the dependabot/pip/requirements/pytorch-lightning-1.7.0 branch August 2, 2022 23:39
adamjstewart added a commit that referenced this pull request Sep 3, 2022
* Bump pytorch-lightning from 1.6.5 to 1.7.0 in /requirements

Bumps [pytorch-lightning](https://github.com/Lightning-AI/lightning) from 1.6.5 to 1.7.0.
- [Release notes](https://github.com/Lightning-AI/lightning/releases)
- [Commits](Lightning-AI/pytorch-lightning@1.6.5...pl/1.7.0)

---
updated-dependencies:
- dependency-name: pytorch-lightning
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>

* Remove protobuf restrictions

* LightningModule was moved

* Mypy fixes

* Ensure same behavior

* Fix docs

* Silence warnings

* Change error message location

Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: Adam J. Stewart <ajstewart426@gmail.com>
yichiac pushed a commit to yichiac/torchgeo that referenced this pull request Apr 29, 2023