1 change: 1 addition & 0 deletions .azure/ipynb-publish.yml
@@ -62,6 +62,7 @@ jobs:

 - bash: |
     set -e
+    sudo apt-get update -q --fix-missing
     sudo apt install -y tree ffmpeg
     pip --version
     pip install --requirement requirements.txt
1 change: 1 addition & 0 deletions .azure/ipynb-tests.yml
@@ -42,6 +42,7 @@ jobs:

 - bash: |
     set -e
+    sudo apt-get update -q --fix-missing
     sudo apt install -y tree ffmpeg
     pip --version
     pip install --requirement requirements.txt
1 change: 1 addition & 0 deletions .github/workflows/ci_testing.yml
@@ -39,6 +39,7 @@ jobs:

 - name: Install dependencies
   run: |
+    sudo apt-get update -q --fix-missing
     sudo apt install -y ffmpeg
     pip --version
     pip install --requirement requirements.txt --find-links https://download.pytorch.org/whl/cpu/torch_stable.html
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -99,7 +99,7 @@
 #
 # This is also used if you do content translation via gettext catalogs.
 # Usually you set "language" from the command line for these cases.
-language = None
+language = "en"

 # List of patterns, relative to source directory, that match files and
 # directories to ignore when looking for source files.
1 change: 1 addition & 0 deletions lightning_examples/finetuning-scheduler/.meta.yml
@@ -15,5 +15,6 @@ description: |
   and foundational model weights. The required dependencies are installed via the finetuning-scheduler ``[examples]`` extra.
 requirements:
   - finetuning-scheduler[examples]
+  - hydra-core>=1.1.0
 accelerator:
   - GPU
@@ -59,7 +59,7 @@
 # %% [markdown]
 # ## The Default Finetuning Schedule
 #
-# Schedule definition is facilitated via the [gen_ft_schedule](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.SchedulingMixin.gen_ft_schedule) method which dumps a default finetuning schedule (by default using a naive, 2-parameters per level heuristic) which can be adjusted as
+# Schedule definition is facilitated via the [gen_ft_schedule](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts_supporters.html#finetuning_scheduler.fts_supporters.ScheduleImplMixin.gen_ft_schedule) method which dumps a default finetuning schedule (by default using a naive, 2-parameters per level heuristic) which can be adjusted as
 # desired by the user and/or subsequently passed to the callback. Using the default/implicitly generated schedule will likely be less computationally efficient than a user-defined finetuning schedule but is useful for exploring a model's finetuning behavior and can serve as a good baseline for subsequent explicit schedule refinement.
 # While the current version of [FinetuningScheduler](https://finetuning-scheduler.readthedocs.io/en/stable/api/finetuning_scheduler.fts.html#finetuning_scheduler.fts.FinetuningScheduler) only supports single optimizer and (optional) lr_scheduler configurations, per-phase maximum learning rates can be set as demonstrated in the next section.

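For orientation, here is a minimal sketch (not part of this diff) of the workflow the updated docstring link describes: the `FinetuningScheduler` callback invokes `gen_ft_schedule` to dump a default schedule, which the user can then edit and pass back explicitly. `MyModel`, the random data, and the dumped filename are illustrative assumptions; `gen_ft_sched_only` and `ft_schedule` are options of the `FinetuningScheduler` callback per the finetuning-scheduler documentation.

```python
# Minimal sketch, not from this PR: the dump-then-refine schedule workflow.
# ``MyModel`` and the random data are illustrative stand-ins; the dumped yaml
# path is also illustrative (it lands under ``Trainer.log_dir``).
import torch
import pytorch_lightning as pl
from finetuning_scheduler import FinetuningScheduler
from torch.utils.data import DataLoader, TensorDataset


class MyModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=1e-3)


train_dl = DataLoader(
    TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,))), batch_size=8
)

# 1. Dump the default schedule (the naive 2-parameters-per-level heuristic);
#    with ``gen_ft_sched_only=True`` the run exits after writing the yaml.
trainer = pl.Trainer(callbacks=[FinetuningScheduler(gen_ft_sched_only=True)], max_epochs=1)
trainer.fit(MyModel(), train_dl)

# 2. Hand-edit the dumped yaml as desired, then train with the explicit schedule:
trainer = pl.Trainer(
    callbacks=[FinetuningScheduler(ft_schedule="MyModel_ft_schedule.yaml")], max_epochs=1
)
trainer.fit(MyModel(), train_dl)
```

Dumping the implicit schedule once and then training from the edited yaml is the baseline-then-refine pattern the tutorial text above recommends before moving to per-phase maximum learning rates.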