SAFT: Self-Attention Factor-Tuning

A highly efficient fine-tuning technique for large-scale neural networks, offering roughly 16x more parameter-efficient training than regular full fine-tuning.

Code for the paper Self-Attention Factor-Tuning for Parameter-Efficient Fine-Tuning
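As a rough illustration of the idea behind the name (and of what the rank and scale arguments used below control), here is a minimal, hedged sketch. It assumes SAFT adds small, trainable low-rank factorized updates to frozen self-attention projection weights; this is a conceptual stand-in, not the library's actual implementation.

# Conceptual sketch only: assumes SAFT-style tuning injects a trainable,
# low-rank factorized update into a frozen attention projection, with the
# trainable parameter count controlled by `rank` and the update magnitude
# by `scale`. Not the library's actual implementation.
import torch
import torch.nn as nn

class FactorTunedLinear(nn.Module):
    """Frozen linear layer plus a trainable low-rank (factorized) update."""
    def __init__(self, base: nn.Linear, rank: int = 3, scale: float = 10.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():   # freeze the pretrained weights
            p.requires_grad = False
        self.down = nn.Linear(base.in_features, rank, bias=False)   # factor U
        self.up = nn.Linear(rank, base.out_features, bias=False)    # factor V
        nn.init.zeros_(self.up.weight)     # start as a zero update
        self.scale = scale

    def forward(self, x):
        return self.base(x) + self.scale * self.up(self.down(x))

# Only the two small factors are trained: for a d x d projection the update
# costs 2*d*rank parameters instead of d*d, which is where the savings over
# full fine-tuning come from.
qkv = nn.Linear(768, 768 * 3)              # e.g. a ViT-B/16 attention qkv projection
tuned_qkv = FactorTunedLinear(qkv, rank=3, scale=10.0)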


Table of Contents

  1. Quickstart
  2. VTAB-1k Test
  3. Pretrained Model
  4. Results

Quickstart

Easily install SAFT using pip and get started with a simple example.

Installation

pip install saft

Example Usage

from saft.saft import saft

if __name__ == "__main__":
    saft_instance = saft(
        model='vit_base_patch16_224',
        # get_classes_num is a helper shipped with the saft package; if its
        # import path differs in your install, pass the class count directly
        # (Oxford Flowers-102 has 102 classes).
        num_classes=get_classes_num('oxford_flowers102'),
        validation_interval=1,
        rank=3,
        scale=10
    )

    # Replace with your own PyTorch DataLoader objects
    # (see the data-wiring sketch below):
    # train_dl, test_dl = [your data in PyTorch DataLoaders]
    # saft_instance.upload_data(train_dl, test_dl)

    saft_instance.train(10)
    trained_model = saft_instance.model
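To make the commented-out data step concrete, here is a minimal data-wiring sketch. It uses torchvision's Flowers102 dataset as a stand-in for your own data; the transform, normalization values, and batch size are illustrative assumptions, not values prescribed by the repository.

# Illustrative data-wiring sketch: torchvision's Flowers102 stands in for
# your own dataset. Continues from the saft_instance created above.
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),   # ViT-B/16 expects 224x224 inputs
    transforms.ToTensor(),
    transforms.Normalize(mean=(0.5, 0.5, 0.5), std=(0.5, 0.5, 0.5)),  # assumed normalization
])

train_ds = datasets.Flowers102(root='data', split='train', transform=transform, download=True)
test_ds = datasets.Flowers102(root='data', split='test', transform=transform, download=True)

train_dl = DataLoader(train_ds, batch_size=64, shuffle=True, num_workers=4)
test_dl = DataLoader(test_ds, batch_size=64, shuffle=False, num_workers=4)

saft_instance.upload_data(train_dl, test_dl)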

VTAB-1k Test

To run tests on the VTAB-1K dataset, follow these steps:

  1. Visit the SSF Data Preparation page to download the VTAB-1K dataset.
  2. Place the downloaded dataset folders in <YOUR PATH>/SAFT/data/ (a quick sanity check follows this list).
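As an optional sanity check that the folders landed where expected, something like the following can be run. The dataset folder names listed here are assumptions based on the standard VTAB-1k tasks; adjust them to whatever the SSF preparation actually produced.

# Optional sanity-check sketch: verify the downloaded VTAB-1K folders exist.
# The folder names are assumptions (a subset of the standard VTAB-1k tasks);
# match them to the output of the SSF data preparation.
import os

data_root = os.path.join('<YOUR PATH>', 'SAFT', 'data')
expected = ['cifar', 'caltech101', 'dtd', 'oxford_flowers102',
            'oxford_iiit_pet', 'svhn', 'sun397']
missing = [name for name in expected if not os.path.isdir(os.path.join(data_root, name))]
print('missing datasets:', missing or 'none')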

Pretrained Model

For a quick start, download the pretrained ViT-B/16 model (vit_base_patch16_224, the backbone used in the example above).
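Alternatively, the same ViT-B/16 weights can be pulled programmatically through timm (assuming timm is installed; the model name matches the one used in the example above). Saving the state dict locally is optional and shown only for illustration.

# Alternative sketch: fetch pretrained ViT-B/16 weights via timm instead of
# a manual download. Assumes `pip install timm`.
import timm
import torch

backbone = timm.create_model('vit_base_patch16_224', pretrained=True)
torch.save(backbone.state_dict(), 'vit_base_patch16_224.pth')  # optional local copy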

Results

[Figures: Performance Results]
