# What Makes Instance Discrimination Good for Transfer Learning?

Nanxuan Zhao\*, Zhirong Wu\*, Rynson W.H. Lau, Stephen Lin

## Pretrained Models

Different data augmentations for learning self-supervised and supervised representations (Table 1).

| Pretraining  | PyTorch Augmentation                 | Download |
| ------------ | ------------------------------------ | -------- |
| Unsupervised | + `RandomHorizontalFlip(0.5)`        | model    |
|              | + `RandomResizedCrop(224)`           | model    |
|              | + `ColorJitter(0.4, 0.4, 0.4, 0.1)`  | model    |
|              | + `RandomGrayscale(p=0.2)`           | model    |
|              | + `GaussianBlur(0.1, 0.2)`           | model    |
| Supervised   | + `RandomHorizontalFlip(0.5)`        | model    |
|              | + `RandomResizedCrop(224)`           | model    |
|              | + `ColorJitter(0.4, 0.4, 0.4, 0.1)`  | model    |
|              | + `RandomGrayscale(p=0.2)`           | model    |
|              | + `GaussianBlur(0.1, 0.2)`           | model    |
Transfer performance with pretraining on various datasets (Table 2).

| Pretraining  | Pretraining Data | Download |
| ------------ | ---------------- | -------- |
| Unsupervised | ImageNet         | model    |
|              | ImageNet-10%     | model    |
|              | ImageNet-100     | model    |
|              | Places           | model    |
|              | CelebA           | model    |
|              | COCO             | model    |
|              | Synthia          | model    |
| Supervised   | ImageNet         | model    |
|              | ImageNet-10%     | model    |
|              | ImageNet-100     | model    |
|              | Places           | model    |
|              | CelebA           | model    |
|              | COCO             | model    |
|              | Synthia          | model    |
Exemplar-based supervised pretraining (Table 3).

| Model       | Download |
| ----------- | -------- |
| Exemplar v1 | model    |
| Exemplar v2 | model    |

## Citation

If you use this work in your research, please cite:

```bibtex
@inproceedings{ZhaoICLR2021,
    author    = {Nanxuan Zhao and Zhirong Wu and Rynson W.H. Lau and Stephen Lin},
    title     = {What Makes Instance Discrimination Good for Transfer Learning?},
    booktitle = {ICLR},
    year      = {2021}
}
```

## About

Pretrained models for "What Makes Instance Discrimination Good for Transfer Learning?" (ICLR 2021).
