What Makes Instance Discrimination Good for Transfer Learning?
Nanxuan Zhao*, Zhirong Wu*, Rynson W.H. Lau, Stephen Lin

Pretrained Models

Different data augmentations for learning self-supervised and supervised representations (Table 1). A PyTorch composition sketch follows the table.

| Pretraining | PyTorch Augmentation | Download |
| --- | --- | --- |
| Unsupervised | + RandomHorizontalFlip(0.5) | model |
| | + RandomResizedCrop(224) | model |
| | + ColorJitter(0.4, 0.4, 0.4, 0.1) | model |
| | + RandomGrayscale(p=0.2) | model |
| | + GaussianBlur(0.1, 0.2) | model |
| Supervised | + RandomHorizontalFlip(0.5) | model |
| | + RandomResizedCrop(224) | model |
| | + ColorJitter(0.4, 0.4, 0.4, 0.1) | model |
| | + RandomGrayscale(p=0.2) | model |
| | + GaussianBlur(0.1, 0.2) | model |
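
Each row of Table 1 adds one augmentation on top of the previous ones. As a rough illustration only, the full set of listed operations can be composed with torchvision as sketched below; the blur kernel size, the reading of GaussianBlur(0.1, 0.2) as a sigma range, and the normalization statistics are assumptions rather than this repository's training code.

```python
# Minimal sketch of the cumulative augmentation pipeline listed in Table 1.
# Requires torchvision >= 0.8 for transforms.GaussianBlur.
import torchvision.transforms as T

augmentation = T.Compose([
    T.RandomResizedCrop(224),
    T.RandomHorizontalFlip(p=0.5),
    T.ColorJitter(0.4, 0.4, 0.4, 0.1),
    # GaussianBlur(0.1, 0.2) from the table is interpreted as a sigma range;
    # the kernel size (23) is an assumption for illustration.
    T.RandomGrayscale(p=0.2),
    T.GaussianBlur(kernel_size=23, sigma=(0.1, 0.2)),
    T.ToTensor(),
    # Standard ImageNet statistics, assumed here.
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
```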
Transfer performance with pretraining on various datasets (Table 2).

| Pretraining | Pretraining Data | Download |
| --- | --- | --- |
| Unsupervised | ImageNet | model |
| | ImageNet-10% | model |
| | ImageNet-100 | model |
| | Places | model |
| | CelebA | model |
| | COCO | model |
| | Synthia | model |
| Supervised | ImageNet | model |
| | ImageNet-10% | model |
| | ImageNet-100 | model |
| | Places | model |
| | CelebA | model |
| | COCO | model |
| | Synthia | model |
Exemplar-based supervised pretraining (Table 3). A sketch for loading any of the released checkpoints follows the table.

| Model | Download |
| --- | --- |
| Exemplar v1 | model |
| Exemplar v2 | model |
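
The checkpoint format is not documented in this README. The sketch below shows one way to restore a backbone, assuming the files are ordinary PyTorch checkpoints holding a ResNet-50 state dict; the `state_dict` container key and the `module.` prefix handling are assumptions, so inspect a downloaded file and adjust accordingly.

```python
# Rough sketch for restoring a released checkpoint into a torchvision ResNet-50.
# The checkpoint layout ('state_dict' key, optional DataParallel 'module.' prefix)
# is an assumption, not taken from this repository.
import torch
import torchvision.models as models

def load_backbone(ckpt_path: str) -> torch.nn.Module:
    model = models.resnet50()
    ckpt = torch.load(ckpt_path, map_location="cpu")
    state_dict = ckpt.get("state_dict", ckpt)  # assumed container key
    # Drop a possible DataParallel prefix from parameter names.
    state_dict = {k.replace("module.", "", 1): v for k, v in state_dict.items()}
    # Keep only weights whose names and shapes match the backbone, skipping any
    # projection-head or classifier weights stored in the checkpoint.
    model_sd = model.state_dict()
    state_dict = {k: v for k, v in state_dict.items()
                  if k in model_sd and v.shape == model_sd[k].shape}
    model.load_state_dict(state_dict, strict=False)
    return model
```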

Citation

If you use this work in your research, please cite:

@inproceedings{ZhaoICLR2021, 
    author = {Nanxuan Zhao and Zhirong Wu and Rynson W.H. Lau and Stephen Lin}, 
    title = {What Makes Instance Discrimination Good for Transfer Learning?}, 
    booktitle = {ICLR}, 
    year = {2021} 
}
