Minor Fine-Tuning Scheduler Tutorial Updates for PTL 1.7 #187

Merged: 17 commits, Aug 15, 2022
Changes from 11 commits
7 changes: 2 additions & 5 deletions lightning_examples/basic-gan/gan.py
@@ -20,7 +20,7 @@
# ### MNIST DataModule
#
# Below, we define a DataModule for the MNIST Dataset. To learn more about DataModules, check out our tutorial
-# on them or see the [latest docs](https://pytorch-lightning.readthedocs.io/en/stable/extensions/datamodules.html).
+# on them or see the [latest release docs](https://pytorch-lightning.readthedocs.io/en/stable/data/datamodules.html).


# %%
@@ -43,9 +43,6 @@ def __init__(
]
)

-# self.dims is returned when you call dm.size()
-# Setting default dims here because we know them.
-# Could optionally be assigned dynamically in dm.setup()
self.dims = (1, 28, 28)
self.num_classes = 10

@@ -248,7 +245,7 @@ def on_validation_epoch_end(self):

# %%
dm = MNISTDataModule()
-model = GAN(*dm.size())
+model = GAN(*dm.dims)
trainer = Trainer(
accelerator="auto",
devices=1 if torch.cuda.is_available() else None,  # limiting to one device for iPython runs
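The `dm.size()` → `dm.dims` change above can be sketched in isolation. A minimal stand-in (the `ToyDataModule` and `ToyGAN` classes below are illustrative, not the tutorial's full `MNISTDataModule`/`GAN`) shows the `dims` attribute and how it is unpacked into the model constructor:

```python
# Minimal sketch of the pattern in the diff above: a DataModule exposing
# `dims` and a model consuming it via argument unpacking. `ToyDataModule`
# and `ToyGAN` are hypothetical stand-ins for the tutorial classes.

class ToyDataModule:
    def __init__(self):
        # Known ahead of time for MNIST: 1 channel, 28x28 pixels
        self.dims = (1, 28, 28)
        self.num_classes = 10

class ToyGAN:
    def __init__(self, channels, width, height):
        self.img_shape = (channels, width, height)

dm = ToyDataModule()
model = ToyGAN(*dm.dims)  # replaces the removed GAN(*dm.size()) call
print(model.img_shape)    # -> (1, 28, 28)
```

Reading the shape from a plain attribute rather than a `dm.size()` method is the direction the diff moves in, since `LightningDataModule.size()` was deprecated.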
13 changes: 6 additions & 7 deletions lightning_examples/finetuning-scheduler/.meta.yml
@@ -1,20 +1,19 @@
-title: Finetuning Scheduler
+title: Fine-Tuning Scheduler
author: "[Dan Dale](https://github.com/speediedan)"
created: 2021-11-29
-updated: 2022-06-10
+updated: 2022-08-06
license: CC BY-SA
build: 0
tags:
-- Finetuning
+- Fine-Tuning
description: |
-This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
-and demonstrates the use of it to finetune a small foundational model on the
+This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
+and demonstrates the use of it to fine-tune a small foundational model on the
[RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of
[SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified
schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data
and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
requirements:
-- finetuning-scheduler[examples]
-- hydra-core>=1.1.0
+- finetuning-scheduler[examples]>=0.2.0
accelerator:
- GPU
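The description above mentions early stopping "defined according to a user-specified schedule". A minimal sketch of what a phase-indexed fine-tuning schedule can look like, expressed as a plain Python dict (the phase-keyed structure follows the Fine-Tuning Scheduler documentation; the parameter name patterns are hypothetical examples, not taken from the tutorial's model):

```python
# Illustrative multi-phase fine-tuning schedule: phase 0 thaws the task
# head first, later phases progressively unfreeze earlier layers. The
# parameter name patterns below are hypothetical, not the tutorial's model.
ft_schedule = {
    0: {"params": ["model.classifier.bias", "model.classifier.weight"]},
    1: {"params": ["model.pooler.dense.bias", "model.pooler.dense.weight"]},
    2: {"params": ["model.encoder.layer.*"]},  # remaining encoder layers
}

# Phases run in ascending order; each phase adds its parameter groups to
# the set already being trained, with early stopping gating transitions.
for phase in sorted(ft_schedule):
    print(phase, ft_schedule[phase]["params"])
```

In practice such a schedule would be written to a YAML file and handed to the scheduler's callback; the exact callback API is documented in the finetuning-scheduler package rather than shown here.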