
[ICIAP23] Sparse Double Descent in Vision Transformers: real or phantom threat?


Sparse Double Descent in Vision Transformers: real or phantom threat?


This repository implements the key experiments of the paper Sparse Double Descent in Vision Transformers: real or phantom threat?.

Occurrence of Sparse Double Descent in Vision Transformers?

teaser

Figure: Test accuracy of ViT on (Left) CIFAR-10 and (Right) CIFAR-100 with different amounts of label noise $\varepsilon$.

Libraries

  • Python = 3.10
  • PyTorch = 1.13
  • Torchvision = 0.14
  • Numpy = 1.23
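
To reproduce this environment with pip, the pinned versions above can be collected into a requirements file. This is a sketch inferred from the list above (Python 3.10 itself must be provided by the interpreter, not pip; the exact patch versions are assumed):

```
# requirements.txt (sketch based on the versions listed above)
torch==1.13.0
torchvision==0.14.0
numpy==1.23.0
```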

Usage

In practice, you can begin with a set of defaults and optionally override individual hyperparameters as desired. To view the hyperparameters for each subcommand, run:

main.py [subcommand] [...] --help

Example Runs

To run a ViT on CIFAR-10 with 10% label noise, a batch size of 512, a learning rate of 1e-4, and a weight decay of 0.03 for 200 epochs (the default configuration): python main.py

To run a ResNet-18 on CIFAR-100 with 20% label noise, a batch size of 128, a learning rate of 0.1, and a weight decay of 1e-4 for 160 epochs: python main.py --model='resnet-18' --num_classes=100 --amount_noise=0.2 --batch_size=128 --learning_rate=0.1 --weight_decay=1e-4 --epochs=160
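
Sparse double descent curves like the ones in the figure above are typically traced out by magnitude pruning a trained network at increasing sparsity levels. The repository's exact pruning routine is not shown in this README, so the following is only a minimal NumPy sketch of unstructured magnitude pruning (the function name `magnitude_prune` is hypothetical, not part of this codebase):

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, amount: float) -> np.ndarray:
    """Zero out the `amount` fraction of entries with the smallest
    absolute value (unstructured magnitude pruning), returning a copy."""
    flat = np.abs(weights).ravel()
    k = int(amount * flat.size)          # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude acts as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold   # keep only weights above it
    return weights * mask
```

In an iterative-pruning loop, this step would be applied to each layer's weight matrix, followed by retraining, with the sparsity `amount` increased at every round.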

Citation

If you find this work useful for your research, please cite the following paper.

@inproceedings{quetu2023sparse,
  title={Sparse Double Descent in Vision Transformers: real or phantom threat?},
  author={Qu{\'e}tu, Victor and Milovanovi{\'c}, Marta and Tartaglione, Enzo},
  booktitle={International Conference on Image Analysis and Processing},
  pages={490--502},
  year={2023},
  organization={Springer}
}
