Minor Fine-Tuning Scheduler Tutorial Updates for PTL 1.7 #187

Merged: 17 commits, Aug 15, 2022 (showing changes from 7 commits)
13 changes: 6 additions & 7 deletions lightning_examples/finetuning-scheduler/.meta.yml
@@ -1,20 +1,19 @@
-title: Finetuning Scheduler
+title: Fine-Tuning Scheduler
 author: "[Dan Dale](https://github.com/speediedan)"
 created: 2021-11-29
-updated: 2022-06-10
+updated: 2022-08-06
 license: CC BY-SA
 build: 0
 tags:
-  - Finetuning
+  - Fine-Tuning
 description: |
-  This notebook introduces the [Finetuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
-  and demonstrates the use of it to finetune a small foundational model on the
+  This notebook introduces the [Fine-Tuning Scheduler](https://finetuning-scheduler.readthedocs.io/en/stable/index.html) extension
+  and demonstrates the use of it to fine-tune a small foundational model on the
   [RTE](https://huggingface.co/datasets/viewer/?dataset=super_glue&config=rte) task of
   [SuperGLUE](https://super.gluebenchmark.com/) with iterative early-stopping defined according to a user-specified
   schedule. It uses Hugging Face's ``datasets`` and ``transformers`` libraries to retrieve the relevant benchmark data
   and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
 requirements:
-  - finetuning-scheduler[examples]
-  - hydra-core>=1.1.0
+  - finetuning-scheduler[examples]>=0.2.0
 accelerator:
   - GPU
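The "user-specified schedule" the description refers to drives phased fine-tuning: parameter groups are unfrozen cumulatively, phase by phase, with early stopping applied within each phase. A minimal, self-contained sketch of that idea follows; the schedule layout and the `apply_phase` helper are illustrative assumptions for this sketch, not the actual finetuning-scheduler API.

```python
# Hypothetical sketch of schedule-driven progressive unfreezing.
# The schedule structure and helper name are illustrative only,
# not the real finetuning-scheduler interface.

# A schedule maps phase index -> parameter-name prefixes unfrozen in that
# phase. Phase 0 runs first; later phases unfreeze more of the model.
SCHEDULE = {
    0: ["classifier."],                    # train only the task head first
    1: ["encoder.layer.11."],              # then the top encoder layer
    2: ["encoder.layer.", "embeddings."],  # finally the rest of the model
}

def apply_phase(param_names, schedule, phase):
    """Return the set of parameter names trainable at `phase`.

    Unfreezing is cumulative: every prefix listed for phases 0..phase
    marks its matching parameters as trainable.
    """
    prefixes = [p for ph in range(phase + 1) for p in schedule.get(ph, [])]
    return {n for n in param_names if any(n.startswith(p) for p in prefixes)}

# Toy parameter names standing in for a small transformer's state dict.
params = [
    "embeddings.word_embeddings.weight",
    "encoder.layer.0.attention.weight",
    "encoder.layer.11.attention.weight",
    "classifier.weight",
]

for phase in sorted(SCHEDULE):
    print(phase, sorted(apply_phase(params, SCHEDULE, phase)))
```

In the real extension, each phase additionally trains to convergence under an early-stopping criterion before the next group is unfrozen, which is what the notebook demonstrates on RTE.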