Parameter-Efficient Fine-Tuning Design Spaces

This is the official implementation of "Parameter-Efficient Fine-Tuning Design Spaces".

Usage

Set up the environment before running the experiments.

Setup

cd models
pip install -e .

Experiments

To evaluate the S4 model (aliased as PEFT in the code) with the RoBERTa backbone on GLUE:

export TASK_NAME=sst-2

python run_glue.py \
  --model_name_or_path roberta-base \
  --task_name $TASK_NAME \
  --do_train \
  --do_eval \
  --max_seq_length 128 \
  --per_device_train_batch_size 16 \
  --learning_rate 5e-5 \
  --num_train_epochs 10.0 \
  --output_dir /tmp/$TASK_NAME \
  --overwrite_output_dir \
  --train_adapter \
  --adapter_config PEFT
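To evaluate on every GLUE task rather than a single one, the command above can be generated per task. The sketch below is illustrative and not part of the repo: the `build_command` helper and the task-name list are assumptions (task naming may differ across script versions), but the flags mirror the invocation shown above.

```python
# Illustrative helper (not part of this repo): build the run_glue.py
# invocation for each GLUE task, mirroring the flags shown above.
# Task names here follow the older GLUE naming convention and are an
# assumption; adjust them to whatever your run_glue.py version expects.
GLUE_TASKS = ["cola", "sst-2", "mrpc", "sts-b", "qqp", "mnli", "qnli", "rte"]

def build_command(task_name, model="roberta-base", output_root="/tmp"):
    """Return the run_glue.py invocation for one GLUE task as an argv list."""
    return [
        "python", "run_glue.py",
        "--model_name_or_path", model,
        "--task_name", task_name,
        "--do_train",
        "--do_eval",
        "--max_seq_length", "128",
        "--per_device_train_batch_size", "16",
        "--learning_rate", "5e-5",
        "--num_train_epochs", "10.0",
        "--output_dir", f"{output_root}/{task_name}",
        "--overwrite_output_dir",
        "--train_adapter",
        "--adapter_config", "PEFT",
    ]

# One command per GLUE task, e.g. for use with subprocess.run(cmd).
commands = [build_command(t) for t in GLUE_TASKS]
```

Each entry in `commands` can be passed to `subprocess.run` to launch one training run; keeping the flags in one helper avoids the per-task copy-paste drift that shell scripts tend to accumulate.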

Acknowledgement

Part of our code is adapted from adapter-transformers.

License

This project is licensed under the Apache-2.0 License.
