Minimal example scripts for the Hugging Face `Trainer`, each focused on staying under 150 lines (while remaining readable) and being runnable in under 10 minutes.
All examples are based on the Tasks documentation from `transformers`.
This collection contains a variety of task scripts for Hugging Face `transformers`, kept minimal to enhance readability, hackability, and learning.
Each script is self-contained and requires no arguments. Simply:

```bash
git clone https://github.com/muellerzr/minimal-trainer-zoo
cd minimal-trainer-zoo
pip install -r requirements.txt
python {script.py}
```

(Use `torchrun` or `accelerate launch` instead of `python` if you want to use DDP.)
Note that each script is tuned for a Tesla T4 GPU to maximize memory usage (and efficiency).