
Upgrade lightning to 2.0 #3419

Merged: 11 commits merged into autogluon:master on Aug 1, 2023
Conversation

@zhiqiangdon (Contributor) commented Jul 25, 2023

Issue #, if available:

Description of changes:

TODOs:

  • Replace the dp strategy in inference with ddp or ddp_spawn, since dp is no longer supported in Lightning 2.0 (see the sketch below).
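
A minimal sketch of the intended swap (illustrative only, not this PR's code; the model and dataloader names are placeholders):

```python
import pytorch_lightning as pl

# Lightning 2.0 removed the "dp" (DataParallel) strategy, so multi-GPU
# inference needs a DistributedDataParallel variant instead.
trainer = pl.Trainer(
    accelerator="gpu",
    devices=2,
    strategy="ddp_spawn",  # previously strategy="dp"; "ddp" also works
)
# predictions = trainer.predict(model, dataloaders=predict_dataloader)  # placeholder names
```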

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

@review-notebook-app (bot)

Check out this pull request on ReviewNB to see visual diffs and provide feedback on Jupyter notebooks.

@zhiqiangdon added the "model list checked" label (you have updated the model list after modifying multimodal unit tests/docs) on Jul 25, 2023
@zhiqiangdon changed the title from "[AutoMM] Upgrade lightning to 2.0" to "Upgrade lightning to 2.0" on Jul 25, 2023
@github-actions

Job PR-3419-05d0c20 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3419/05d0c20/index.html

@github-actions

Job PR-3419-059fd91 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3419/059fd91/index.html

@@ -30,7 +30,7 @@
     "torch",  # version range defined in `core/_setup_utils.py`
     "statsmodels>=0.13.0,<0.15",
     "gluonts>=0.13.1,<0.14",
-    "pytorch-lightning>=1.7.4,<1.10.0",
+    "pytorch-lightning>=2.0.0,<2.1",
Collaborator

Changes to timeseries look good to me, I validated that everything works as expected locally with pytorch-lightning v2.0.

Feel free to merge once the changes to multimodal are approved.

@@ -5,6 +5,7 @@
 import torch
 import torch.nn.functional as F
 import torchmetrics
+from pytorch_lightning.utilities import grad_norm
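
For context, a minimal sketch (the class name is a placeholder, not the module defined in this file) of how grad_norm is typically used to log gradient norms, since the track_grad_norm Trainer flag was removed in Lightning 2.0:

```python
import pytorch_lightning as pl
from pytorch_lightning.utilities import grad_norm


class LitModule(pl.LightningModule):  # placeholder module for illustration
    def on_before_optimizer_step(self, optimizer):
        # Compute the 2-norm of each parameter's gradient plus the total norm
        # and send the resulting dict to the logger.
        norms = grad_norm(self, norm_type=2)
        self.log_dict(norms)
```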
Collaborator

Should we at some point consider switching to the new API (as described here) or should we wait until the deprecation notice is released for the old API?

import lightning as L

Contributor Author

Good call. We can switch to the new namespace when it becomes more stable.
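
For reference, a minimal sketch contrasting the two namespaces (assumes the standalone lightning package is installed; this PR keeps the pytorch_lightning namespace):

```python
# Namespace kept by this PR (pytorch-lightning package).
import pytorch_lightning as pl

# Unified namespace shipped with Lightning 2.0 (lightning package);
# Trainer and LightningModule are re-exported at the top level.
import lightning as L

trainer_old = pl.Trainer(max_epochs=1)
trainer_new = L.Trainer(max_epochs=1)
```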

@yinweisu (Collaborator) left a comment

Changes to _setup_utils look good to me

@github-actions

Job PR-3419-7acac86 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3419/7acac86/index.html

@tonyhoo (Collaborator) left a comment

LGTM

@github-actions (bot) commented Aug 1, 2023

Job PR-3419-87c9101 is done.
Docs are uploaded to http://autogluon-staging.s3-website-us-west-2.amazonaws.com/PR-3419/87c9101/index.html

@zhiqiangdon merged commit 90ab2ac into autogluon:master on Aug 1, 2023
28 checks passed
@zhiqiangdon deleted the mm-lightning branch on August 1, 2023 at 23:01
ddelange added a commit to ddelange/autogluon that referenced this pull request on Aug 3, 2023