Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss

Kaidi Cao, Colin Wei, Adrien Gaidon, Nikos Arechiga, Tengyu Ma


This is the official PyTorch implementation of LDAM-DRW from the paper Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss.
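
For reference, the core of LDAM is a per-class margin Delta_j = C / n_j^(1/4), where n_j is the number of training examples of class j: the margin of the ground-truth class is subtracted from its logit before a scaled softmax cross-entropy. The snippet below is a minimal sketch of that idea, not the repo's exact implementation; the margin cap max_m and the logit scale s are assumed, commonly used values.

import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

class LDAMLoss(nn.Module):
    # Margins Delta_j = C / n_j^(1/4), rescaled so the largest margin equals max_m.
    def __init__(self, cls_num_list, max_m=0.5, s=30):
        super().__init__()
        m_list = 1.0 / np.sqrt(np.sqrt(np.asarray(cls_num_list, dtype=np.float64)))
        self.m_list = torch.FloatTensor(m_list * (max_m / m_list.max()))
        self.s = s

    def forward(self, logits, target):
        # Subtract the class-dependent margin from the ground-truth logit only,
        # then apply the usual (scaled) softmax cross-entropy.
        margins = self.m_list.to(logits.device)[target]                  # (batch,)
        one_hot = F.one_hot(target, num_classes=logits.size(1)).float()  # (batch, classes)
        logits_m = logits - margins.unsqueeze(1) * one_hot
        return F.cross_entropy(self.s * logits_m, target)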

Dependency

The code is built with the following libraries:

  • PyTorch

Dataset

  • Imbalanced CIFAR. The original data will be downloaded and converted by imbalance_cifar.py (the exponential imbalance profile is sketched after this list).
  • The paper also reports results on Tiny ImageNet and iNaturalist 2018. We will update the code for those datasets later.
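
For illustration, the exponential ("exp") profile with imbalance factor 0.01 keeps img_max * imb_factor^(j / (num_classes - 1)) images for class j, i.e. the head class of CIFAR-10 keeps all 5000 images and the tail class keeps 50. The helper below is a hypothetical sketch of that schedule, not the repo's imbalance_cifar.py:

def exp_img_num_per_cls(num_classes=10, img_max=5000, imb_factor=0.01):
    # Class j keeps img_max * imb_factor ** (j / (num_classes - 1)) images.
    return [int(img_max * imb_factor ** (j / (num_classes - 1.0)))
            for j in range(num_classes)]

print(exp_img_num_per_cls())  # head class: 5000 images, tail class: 50 images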

Training

We provide several training examples in this repo:

  • To train the ERM baseline on long-tailed imbalance with a ratio of 100:
python cifar_train.py --gpu 0 --imb_type exp --imb_factor 0.01 --loss_type CE --train_rule None
  • To train with the LDAM loss and DRW training on long-tailed imbalance with a ratio of 100 (the DRW weighting schedule is sketched after this list):
python cifar_train.py --gpu 0 --imb_type exp --imb_factor 0.01 --loss_type LDAM --train_rule DRW
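
For context, DRW (deferred re-weighting) keeps uniform class weights during the first stage of training and only switches to class-balanced weights late in training. The sketch below shows one way such weights can be derived from the effective number of samples; the switch epoch, beta, and the normalization are assumed values, not necessarily the repo's exact settings.

import numpy as np
import torch

def drw_per_cls_weights(cls_num_list, epoch, defer_epoch=160, beta=0.9999):
    if epoch < defer_epoch:
        # Stage 1: no re-weighting.
        per_cls_weights = np.ones(len(cls_num_list))
    else:
        # Stage 2: weight each class by the inverse effective number of samples.
        effective_num = 1.0 - np.power(beta, cls_num_list)
        per_cls_weights = (1.0 - beta) / effective_num
    # Normalize so the weights sum to the number of classes.
    per_cls_weights = per_cls_weights / per_cls_weights.sum() * len(cls_num_list)
    return torch.FloatTensor(per_cls_weights)

The resulting tensor can then be passed as the weight argument of torch.nn.functional.cross_entropy, or combined with an LDAM-style loss as in the sketch above.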

Reference

If you find our paper and repo useful, please cite as

@inproceedings{cao2019learning,
  title={Learning Imbalanced Datasets with Label-Distribution-Aware Margin Loss},
  author={Cao, Kaidi and Wei, Colin and Gaidon, Adrien and Arechiga, Nikos and Ma, Tengyu},
  booktitle={Advances in Neural Information Processing Systems},
  year={2019}
}
