
Enhancement Request: Make toggling between FTS standalone (pytorch-lightning) and unified (lightning) dependent versions easier #10

jnyjxn opened this issue Jun 14, 2023 · 1 comment


jnyjxn commented Jun 14, 2023

🐛 Bug

I am trying to integrate finetuning-scheduler with my code, which is built around pytorch_lightning, and am encountering import errors that appear to be essentially the mirror of #8.

As a result, when specifying finetuning callbacks in the config as per the example, the following error is observed:

main.py: error: Parser key "trainer.callbacks":
  Does not validate against any of the Union subtypes
  Subtypes: (typing.List[pytorch_lightning.callbacks.callback.Callback], <class 'pytorch_lightning.callbacks.callback.Callback'>, <class 'NoneType'>)
  Errors:
    - Expected a <class 'list'>
    - Import path finetuning_scheduler.FinetuningScheduler does not correspond to a subclass of <class 'pytorch_lightning.callbacks.callback.Callback'>
    - Expected a <class 'NoneType'>
  Given value type: <class 'dict'>
  Given value: {'class_path': 'finetuning_scheduler.FinetuningScheduler'}
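The underlying cause is generic Python behavior: identically named classes defined in different packages are unrelated types, so a subclass check against the "wrong" package's base fails. A minimal stdlib-only analogue (toy module names, not the real packages):

```python
import types

# Toy stand-ins for two separately shipped copies of the same class,
# as with pytorch_lightning.Callback vs. lightning.pytorch.Callback.
mod_a = types.ModuleType("toy_pytorch_lightning")
mod_b = types.ModuleType("toy_lightning")

class _CallbackA:  # stands in for pytorch_lightning.Callback
    pass

class _CallbackB:  # stands in for lightning.pytorch.Callback
    pass

mod_a.Callback = _CallbackA
mod_b.Callback = _CallbackB

# A callback subclassing module B's base is unrelated to module A's base,
# even though the two class definitions are textually identical.
class ToyFinetuningScheduler(mod_b.Callback):
    pass

print(issubclass(ToyFinetuningScheduler, mod_b.Callback))  # True
print(issubclass(ToyFinetuningScheduler, mod_a.Callback))  # False
```

This is why the CLI's `Union` subtype validation rejects the callback: it checks against the `pytorch_lightning` base class, while the installed FTS build subclasses the `lightning.pytorch` one.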

To Reproduce

import finetuning_scheduler
import pytorch_lightning as pl
issubclass(finetuning_scheduler.FinetuningScheduler, pl.Callback)
>>> False
issubclass(finetuning_scheduler.FTSCheckpoint, pl.Callback)
>>> False
issubclass(finetuning_scheduler.FTSEarlyStopping, pl.Callback)
>>> False

import lightning.pytorch as pl
issubclass(finetuning_scheduler.FinetuningScheduler, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSCheckpoint, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSEarlyStopping, pl.Callback)
>>> True

Expected behavior

import finetuning_scheduler
import pytorch_lightning as pl
issubclass(finetuning_scheduler.FinetuningScheduler, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSCheckpoint, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSEarlyStopping, pl.Callback)
>>> True

import lightning.pytorch as pl
issubclass(finetuning_scheduler.FinetuningScheduler, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSCheckpoint, pl.Callback)
>>> True
issubclass(finetuning_scheduler.FTSEarlyStopping, pl.Callback)
>>> True

Environment

  • CUDA:
    • GPU:
    • available: False
    • version: None
  • Packages:
    • finetuning-scheduler: 2.0.2
    • numpy: 1.22.4
    • pyTorch_debug: False
    • pyTorch_version: 2.0.0
    • pytorch-lightning: 2.0.0
    • tqdm: 4.65.0
  • System:
    • OS: Darwin
    • architecture:
      • 64bit
    • processor: arm
    • python: 3.10.3
    • version: Darwin Kernel Version 22.5.0: Mon Apr 24 20:52:24 PDT 2023; root:xnu-8796.121.2~5/RELEASE_ARM64_T6000
@jnyjxn added the bug (Something isn't working) label Jun 14, 2023
@speediedan (Owner)

Happy to help @jnyjxn, apologies for the slow response, I have a lot on my plate at the moment.

Now that the core Lightning package is lightning rather than pytorch-lightning (starting with version >= 2.0), Fine-Tuning Scheduler (FTS) by default depends upon the lightning package rather than the standalone pytorch-lightning one. If you would like to continue to use FTS with the standalone pytorch-lightning package instead, you can still do so by installing FTS accordingly.

For instance, starting with a given PyTorch-only base environment:

# pytorch base env
cd /tmp
conda deactivate
conda remove --name fts_standalone --all -y
conda update -n base -c defaults conda -y 
conda create -n fts_standalone pytorch torchvision pytorch-cuda=11.8 python=3.10 pip -c pytorch -c nvidia -y
conda activate fts_standalone

One can install the standalone pytorch-lightning package, the unified lightning package, or both, and then use finetuning-scheduler with either as desired.1

# install `lightning` which allows us to test with both `pytorch-lightning` standalone and `lightning`
pip install lightning
# install desired version of FTS standalone
export FTS_VERSION=2.0.4
export PACKAGE_NAME=pytorch
wget https://github.com/speediedan/finetuning-scheduler/releases/download/v${FTS_VERSION}/finetuning-scheduler-${FTS_VERSION}.tar.gz
pip install finetuning-scheduler-${FTS_VERSION}.tar.gz

We can then validate that FTS can use pytorch-lightning standalone but not lightning:

python -c "
import pytorch_lightning as pl
import lightning as l
import finetuning_scheduler as fts
print(tuple(issubclass(fts_cls, pl.Callback) for fts_cls in (fts.FinetuningScheduler, fts.FTSCheckpoint, fts.FTSEarlyStopping)))  # (True, True, True)
print(tuple(issubclass(fts_cls, l.Callback) for fts_cls in (fts.FinetuningScheduler, fts.FTSCheckpoint, fts.FTSEarlyStopping))) # (False, False, False)"
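As a convenience for scripts that need to know which flavor an installed FTS build targets, one could inspect the module names in a callback's MRO. This helper is a hypothetical sketch, not part of FTS:

```python
# Hypothetical helper: report which Lightning flavor a callback class was
# built against by walking its method resolution order and checking which
# package its base classes come from.
def lightning_flavor(callback_cls: type) -> str:
    """Return 'unified', 'standalone', or 'unknown' for a callback class."""
    for base in callback_cls.__mro__:
        mod = base.__module__
        if mod.startswith("lightning."):
            return "unified"    # base defined in the unified `lightning` package
        if mod.startswith("pytorch_lightning"):
            return "standalone"  # base defined in standalone `pytorch-lightning`
    return "unknown"
```

With the standalone install above, `lightning_flavor(fts.FinetuningScheduler)` would be expected to report "standalone".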

If there is sufficient community interest I may consider the following ways of enhancing FTS's support for the pytorch-lightning standalone version:

  1. Adding a script that allows switching between Lightning versions without reinstalling FTS by rewriting/updating FTS code in place (Con: not as clean as installing the relevant version of FTS via archive or source)
  2. Adding a second package of finetuning-scheduler built against pytorch-lightning standalone to allow pip installation without a specific archive (Con: requires maintaining a separate pypi package, lots of additional CI and overhead)

Hope this helps! I'm going to change the label and title of this issue to reflect a request for a future enhancement if that's okay with you. Feel free to reach out anytime if you have further questions.

Footnotes

  1. Unfortunately, one can only use FTS with one version of lightning/pytorch-lightning at a time, since FTS needs to be installed to align (as you observed) with a particular target version of Lightning.

@speediedan changed the title from "Use with pytorch_lightning not supported" to "Enhancement Request: Make toggling between FTS standalone (pytorch-lightning) and unified (lightning) dependent versions easier" Jun 23, 2023
@speediedan added the enhancement (New feature or request) label and removed the bug (Something isn't working) label Jun 23, 2023
@speediedan added this to the future milestone Jun 23, 2023