Minor Fine-Tuning Scheduler Tutorial Updates for PTL 1.7 #187

Merged · 17 commits · Aug 15, 2022
1 change: 1 addition & 0 deletions .azure/ipynb-publish.yml
@@ -64,6 +64,7 @@ jobs:
set -e
sudo apt-get update -q --fix-missing
sudo apt install -y tree ffmpeg
pip install --upgrade pip
pip --version
pip install --requirement requirements.txt
pip install --requirement requirements/data.txt
1 change: 1 addition & 0 deletions .azure/ipynb-tests.yml
@@ -44,6 +44,7 @@ jobs:
set -e
sudo apt-get update -q --fix-missing
sudo apt install -y tree ffmpeg
pip install --upgrade pip
pip --version
pip install --requirement requirements.txt
pip install --requirement requirements/data.txt
7 changes: 2 additions & 5 deletions lightning_examples/basic-gan/gan.py
@@ -20,7 +20,7 @@
# ### MNIST DataModule
#
# Below, we define a DataModule for the MNIST Dataset. To learn more about DataModules, check out our tutorial
# on them or see the [latest docs](https://pytorch-lightning.readthedocs.io/en/stable/extensions/datamodules.html).
# on them or see the [latest release docs](https://pytorch-lightning.readthedocs.io/en/stable/data/datamodule.html).


# %%
@@ -43,9 +43,6 @@ def __init__(
]
)

# self.dims is returned when you call dm.size()
# Setting default dims here because we know them.
# Could optionally be assigned dynamically in dm.setup()
self.dims = (1, 28, 28)
self.num_classes = 10

@@ -248,7 +245,7 @@ def on_validation_epoch_end(self):

# %%
dm = MNISTDataModule()
model = GAN(*dm.size())
model = GAN(*dm.dims)
trainer = Trainer(
accelerator="auto",
devices=1 if torch.cuda.is_available() else None,  # limiting to one device for iPython runs
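The `dm.size()` to `dm.dims` change above (repeated in the diffs below) follows the removal of the `LightningDataModule.size()` helper in PyTorch Lightning 1.7. A minimal sketch of the updated usage, assuming the `MNISTDataModule` and `GAN` classes defined in this notebook:

```python
# Read the user-defined `dims` attribute directly instead of calling the
# removed dm.size() helper; MNISTDataModule and GAN are defined in the notebook.
dm = MNISTDataModule()   # __init__ sets self.dims = (1, 28, 28)
model = GAN(*dm.dims)    # previously: model = GAN(*dm.size())
```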
2 changes: 1 addition & 1 deletion lightning_examples/datamodules/.meta.yml
@@ -8,7 +8,7 @@ description: This notebook will walk you through how to start using Datamodules.
the release of `pytorch-lightning` version 0.9.0, we have included a new class called
`LightningDataModule` to help you decouple data related hooks from your `LightningModule`.
The most up-to-date documentation on datamodules can be found
[here](https://pytorch-lightning.readthedocs.io/en/stable/extensions/datamodules.html).
[here](https://pytorch-lightning.readthedocs.io/en/stable/data/datamodule.html).
requirements:
- torchvision
accelerator:
9 changes: 3 additions & 6 deletions lightning_examples/datamodules/datamodules.py
@@ -146,7 +146,7 @@ def test_dataloader(self):
# 1. ```__init__```
# - Takes in a `data_dir` arg that points to where you have downloaded/wish to download the MNIST dataset.
# - Defines a transform that will be applied across train, val, and test dataset splits.
# - Defines default `self.dims`, which is a tuple returned from `datamodule.size()` that can help you initialize models.
# - Defines default `self.dims`.
#
#
# 2. ```prepare_data```
@@ -176,9 +176,6 @@ def __init__(self, data_dir: str = PATH_DATASETS):
]
)

# self.dims is returned when you call dm.size()
# Setting default dims here because we know them.
# Could optionally be assigned dynamically in dm.setup()
self.dims = (1, 28, 28)
self.num_classes = 10

@@ -274,7 +271,7 @@ def configure_optimizers(self):
# Init DataModule
dm = MNISTDataModule()
# Init model from datamodule's attributes
model = LitModel(*dm.size(), dm.num_classes)
model = LitModel(*dm.dims, dm.num_classes)
# Init trainer
trainer = Trainer(
max_epochs=3,
@@ -341,7 +338,7 @@ def test_dataloader(self):

# %%
dm = CIFAR10DataModule()
model = LitModel(*dm.size(), dm.num_classes, hidden_size=256)
model = LitModel(*dm.dims, dm.num_classes, hidden_size=256)
tqdm_progress_bar = TQDMProgressBar(refresh_rate=20)
trainer = Trainer(
max_epochs=5,
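For reference, a minimal self-contained sketch of the datamodule pattern this notebook describes, with `dims` set as a plain attribute in `__init__` and read directly by consumers; the batch size and split sizes below are illustrative, not taken from the diff:

```python
from pytorch_lightning import LightningDataModule
from torch.utils.data import DataLoader, random_split
from torchvision import transforms
from torchvision.datasets import MNIST


class MNISTDataModule(LightningDataModule):
    def __init__(self, data_dir: str = "./data"):
        super().__init__()
        self.data_dir = data_dir
        self.transform = transforms.Compose(
            [transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))]
        )
        # Plain instance attribute; models can be initialized with *dm.dims.
        self.dims = (1, 28, 28)
        self.num_classes = 10

    def prepare_data(self):
        # Download only; no state assignments here.
        MNIST(self.data_dir, train=True, download=True)
        MNIST(self.data_dir, train=False, download=True)

    def setup(self, stage=None):
        if stage in (None, "fit"):
            full = MNIST(self.data_dir, train=True, transform=self.transform)
            self.mnist_train, self.mnist_val = random_split(full, [55000, 5000])
        if stage in (None, "test"):
            self.mnist_test = MNIST(self.data_dir, train=False, transform=self.transform)

    def train_dataloader(self):
        return DataLoader(self.mnist_train, batch_size=32)

    def val_dataloader(self):
        return DataLoader(self.mnist_val, batch_size=32)

    def test_dataloader(self):
        return DataLoader(self.mnist_test, batch_size=32)
```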
13 changes: 6 additions & 7 deletions lightning_examples/finetuning-scheduler/.meta.yml
@@ -1,20 +1,19 @@
title: Finetuning Scheduler
title: Fine-Tuning Scheduler
author: "[Dan Dale](https://github.com/speediedan)"
created: 2021-11-29
updated: 2022-06-10
updated: 2022-08-06
license: CC BY-SA
build: 0
tags:
- Finetuning
- Fine-Tuning
description: |
This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
and demonstrates the use of it to finetune a small foundational model on the
This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
and demonstrates the use of it to fine-tune a small foundational model on the
[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of
[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified
schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data
and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
requirements:
- finetuning-scheduler[examples]
- hydra-core>=1.1.0
- finetuning-scheduler[examples]>=0.2.0
accelerator:
- GPU
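The description above outlines the Fine-Tuning Scheduler extension; below is a minimal sketch of attaching its callback to a Trainer, not taken from this PR's diff. The schedule filename is hypothetical, and if `ft_schedule` is omitted the callback falls back to an implicitly generated schedule:

```python
from pytorch_lightning import Trainer
from finetuning_scheduler import FinetuningScheduler

# Attach the scheduled fine-tuning callback; the explicit schedule file is a
# hypothetical example and may be omitted to use the default implicit schedule.
trainer = Trainer(
    accelerator="auto",
    devices=1,
    callbacks=[FinetuningScheduler(ft_schedule="RteBoolqModule_ft_schedule.yaml")],
)
# trainer.fit(model, datamodule=dm)  # the RTE module and datamodule come from the tutorial
```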
188 changes: 88 additions & 100 deletions lightning_examples/finetuning-scheduler/finetuning-scheduler.py

Large diffs are not rendered by default.

Binary file modified lightning_examples/finetuning-scheduler/logo_fts.png
2 changes: 1 addition & 1 deletion lightning_examples/mnist-hello-world/hello-world.py
@@ -94,7 +94,7 @@ def configure_optimizers(self):
# - If you don't mind loading all your datasets at once, you can set up a condition to allow for both 'fit' related setup and 'test' related setup to run whenever `None` is passed to `stage` (or ignore it altogether and exclude any conditionals).
# - **Note this runs across all GPUs and it *is* safe to make state assignments here**
#
# 3. [x_dataloader()](https://pytorch-lightning.readthedocs.io/en/stable/api_references.html#core-api) ♻️
# 3. [x_dataloader()](https://pytorch-lightning.readthedocs.io/en/stable/api/pytorch_lightning.core.hooks.DataHooks.html#pytorch_lightning.core.hooks.DataHooks.train_dataloader) ♻️
# - `train_dataloader()`, `val_dataloader()`, and `test_dataloader()` all return PyTorch `DataLoader` instances that are created by wrapping their respective datasets that we prepared in `setup()`


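A hedged sketch of how the hooks listed above are exercised by the Trainer, assuming the notebook's `LitMNIST` module (which implements `prepare_data`, `setup`, and the `*_dataloader()` hooks itself):

```python
from pytorch_lightning import Trainer

model = LitMNIST()  # LightningModule defined in this notebook
trainer = Trainer(max_epochs=3, accelerator="auto", devices=1)
trainer.fit(model)   # prepare_data() -> setup("fit") -> train_dataloader()/val_dataloader()
trainer.test(model)  # setup("test") -> test_dataloader()
```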
9 changes: 3 additions & 6 deletions lightning_examples/mnist-tpu-training/mnist-tpu.py
@@ -23,7 +23,7 @@
# ### Defining The `MNISTDataModule`
#
# Below we define `MNISTDataModule`. You can learn more about datamodules
# in [docs](https://pytorch-lightning.readthedocs.io/en/stable/extensions/datamodules.html).
# in [docs](https://pytorch-lightning.readthedocs.io/en/stable/data/datamodule.html).


# %%
@@ -33,9 +33,6 @@ def __init__(self, data_dir: str = "./"):
self.data_dir = data_dir
self.transform = transforms.Compose([transforms.ToTensor(), transforms.Normalize((0.1307,), (0.3081,))])

# self.dims is returned when you call dm.size()
# Setting default dims here because we know them.
# Could optionally be assigned dynamically in dm.setup()
self.dims = (1, 28, 28)
self.num_classes = 10

@@ -151,7 +148,7 @@ def configure_optimizers(self):
# Init DataModule
dm = MNISTDataModule()
# Init model from datamodule's attributes
model = LitModel(*dm.size(), dm.num_classes)
model = LitModel(*dm.dims, dm.num_classes)
# Init trainer
trainer = Trainer(
max_epochs=3,
@@ -170,7 +167,7 @@ def configure_optimizers(self):
# Init DataModule
dm = MNISTDataModule()
# Init model from datamodule's attributes
model = LitModel(*dm.size(), dm.num_classes)
model = LitModel(*dm.dims, dm.num_classes)
# Init trainer
trainer = Trainer(
max_epochs=3,