
Fine-Tuning Scheduler Tutorial Updates for Lightning 2.0.x #236

Merged
merged 4 commits into Lightning-AI:main on Mar 17, 2023

Conversation

speediedan
Contributor

A few minor updates to the Fine-Tuning Scheduler Tutorial to simplify it slightly and better accommodate the Lightning and PyTorch 2.0.x releases.

Changed

  1. Support for PyTorch and PyTorch Lightning 2.0.0! Note that lightning>=2.0.0 is a dependency of finetuning-scheduler 2.0.0, which this tutorial requires.
  2. Use the unified lightning package rather than the standalone package. Beginning with FTS 2.0, Fine-Tuning Scheduler (FTS) by default depends upon the lightning package rather than the standalone pytorch-lightning package (though the latter can still be installed and used, similar to Lightning). A minimal import sketch follows this list.
  3. Leverage a feature now available in finetuning-scheduler 2.0.0 to simplify the example.
  4. Update the list of distributed strategies FTS supports as of FTS 2.x.
  5. Further simplify the example by requiring torch>=1.12.1. This avoids the assert not step_t.is_cuda, "If capturable=False, state_steps should not be CUDA tensors." failure (pytorch/pytorch#80809) seen with torch 1.12.0, along with the version checking and the specially patched version of AdamW that torch 1.12.0 required.
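
A minimal sketch (not taken from the tutorial; the module name is a placeholder) of the default FTS 2.0 usage against the unified lightning package described in item 2:

```python
# Assumes finetuning-scheduler>=2.0.0 and lightning>=2.0.0 are installed.
import lightning as L
from finetuning_scheduler import FinetuningScheduler


class PlaceholderModule(L.LightningModule):
    ...  # any LightningModule works here; definition omitted for brevity


# With no explicit schedule passed, FinetuningScheduler generates a default
# fine-tuning schedule; the point here is only the unified `lightning` import,
# replacing the previous standalone `pytorch_lightning` one.
trainer = L.Trainer(callbacks=[FinetuningScheduler()])
```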

Congrats on the successful Lightning 2.0 launch!! 🚀 🎉

Before submitting

  • Was this discussed/approved via a GitHub issue? (not needed for typos and docs improvements)
  • Did you make sure to update the docs?
  • Did you write any new necessary tests?

PR review

Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in GitHub issues, there's a high chance it will not be merged.

Did you have fun?

Make sure you had fun coding 🙃

@mergify mergify bot requested a review from Borda March 16, 2023 01:45

codecov bot commented Mar 16, 2023

Codecov Report

Merging #236 (b448cd7) into main (0d8b103) will not change coverage.
The diff coverage is n/a.

Additional details and impacted files
@@         Coverage Diff         @@
##           main   #236   +/-   ##
===================================
  Coverage    73%    73%           
===================================
  Files         2      2           
  Lines       382    382           
===================================
  Hits        280    280           
  Misses      102    102           

@speediedan speediedan marked this pull request as ready for review March 16, 2023 01:58
@speediedan
Contributor Author

@Borda @carmocca @ethanwharris @awaelchli
Congrats on the successful 2.0 launch! Since Fine-Tuning Scheduler 2.0 was released yesterday and by default depends upon the unified lightning package rather than the standalone pytorch-lightning package, I'm hoping this PR can be merged before too long. Only bothering you since I think it's important for users visiting the 2.0 tutorials to see the updated FTS default usage that aligns with the unified lightning package. Thanks again for all your work!!

Member

@awaelchli awaelchli left a comment

Awesome! Thanks a lot for updating

Member

@awaelchli awaelchli left a comment

Awesome

@Borda Borda enabled auto-merge (squash) March 17, 2023 23:11
@Borda Borda merged commit 898229b into Lightning-AI:main Mar 17, 2023
@Borda Borda added the enhancement New feature or request label Mar 18, 2023