saintslab/PePR

Official source code for "Equity through Access: A Case for Small-scale Deep Learning" (Selvan et al., 2024).


PePR Score


Abstract

The recent advances in deep learning (DL) have been accelerated by access to large-scale data and compute. These large-scale resources have been used to train progressively larger models, which are resource intensive in terms of compute, data, energy, and carbon emissions. These costs are becoming a new type of entry barrier to researchers and practitioners with limited access to resources at such scale, particularly in the Global South. In this work, we take a comprehensive look at the landscape of existing DL models for vision tasks and demonstrate their usefulness in settings where resources are limited. To account for the resource consumption of DL models, we introduce a novel measure to estimate the performance per resource unit, which we call the PePR score. Using a diverse family of 131 unique DL architectures (spanning $1M$ to $130M$ trainable parameters) and three medical image datasets, we capture trends about the performance-resource trade-offs. In applications like medical image analysis, we argue that developing small-scale, specialized models is better than striving for large-scale models. Furthermore, we show that using pretrained models can significantly reduce the computational resources and data required. We hope this work will encourage the community to focus on improving AI equity by developing methods and models with smaller resource footprints.
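To make the performance-per-resource idea concrete, below is a minimal Python sketch of a PePR-style score. The formula (performance divided by one plus min-max-normalised resource cost), the function name, and the numbers are illustrative assumptions, not the exact definition from the paper.

import numpy as np

def pepr_style_score(performance: np.ndarray, resource: np.ndarray) -> np.ndarray:
    """Illustrative performance-per-resource score (assumption, not the
    paper's exact definition). Resource consumption is min-max normalised
    to [0, 1] across the model family; a model with negligible resource
    use scores close to its raw performance, while costly models are
    penalised."""
    r = (resource - resource.min()) / (resource.max() - resource.min())
    return performance / (1.0 + r)

# Hypothetical numbers: test accuracy and energy use (kWh) for four models.
acc = np.array([0.78, 0.82, 0.85, 0.86])
energy = np.array([0.2, 0.9, 2.5, 6.0])
print(pepr_style_score(acc, energy))  # smaller models rank higher per unit resource

Under this kind of score, the largest model no longer dominates: the modest accuracy gains of the bigger architectures are weighed against their disproportionately larger resource footprint.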

Citation

Kindly use the following BibTeX entry if you use the code in your work.

@article{selvan2024pepr,
	title={Equity through Access: A Case for Small-scale Deep Learning},
	author={Raghavendra Selvan and Bob Pepin and Christian Igel and Gabrielle Samuel and Erik B Dam},
	journal={arXiv preprint},
	year={2024}}

Requirements

  • Standard PyTorch requirements for training the models.
  • The timm library for instantiating the specific architectures (see the sketch below).
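For instance, a model from the architecture family can be created and sized via timm as follows. This is a minimal sketch; the model name and the number of output classes are assumptions for illustration.

import timm
import torch

# Hypothetical choice: any timm model name can be substituted here.
model = timm.create_model("resnet18", pretrained=True, num_classes=2)

x = torch.randn(1, 3, 224, 224)   # one dummy RGB image
logits = model(x)                  # forward pass
print(logits.shape)                # torch.Size([1, 2])

# Count trainable parameters, as used to characterise model scale.
n_params = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{n_params / 1e6:.1f}M trainable parameters")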

Example Usage

Recreate the plots from the paper:

python paper_plot.py
