DiLo Code and Pretrained Models

Distilling Localization for Self-Supervised Representation Learning
Nanxuan Zhao*, Zhirong Wu*, Rynson W.H. Lau, Stephen Lin

Dataloader

saliency_dataset.py is an example dataloader for adding the DiLo copy-and-paste augmentation.
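To make the augmentation concrete, here is a minimal sketch of saliency-guided copy-and-paste: the saliency map acts as a soft alpha matte for compositing the foreground onto a different background. This is an illustration of the idea, not the repository's exact implementation; the function name paste_foreground is hypothetical.

```python
# Minimal sketch of saliency-guided copy-and-paste (illustrative only).
import numpy as np
from PIL import Image

def paste_foreground(image: Image.Image, mask: Image.Image,
                     background: Image.Image) -> Image.Image:
    """Composite the salient region of `image` onto `background`.

    `mask` is a grayscale saliency map (white = salient foreground).
    """
    img = np.asarray(image.convert("RGB"), dtype=np.float32)
    bg = np.asarray(background.convert("RGB").resize(image.size),
                    dtype=np.float32)
    # Use the saliency map as a soft alpha matte in [0, 1].
    alpha = np.asarray(mask.convert("L").resize(image.size),
                       dtype=np.float32)[..., None] / 255.0
    # Alpha-blend: foreground where salient, background elsewhere.
    out = alpha * img + (1.0 - alpha) * bg
    return Image.fromarray(out.astype(np.uint8))
```

In a dataloader, a composite like this would typically be applied with a freshly drawn background per sample, before the usual random-crop and color augmentations.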

The saliency estimation models used to produce the masks are RBD and BASNet.

Pretrained Models

| Base Model | Saliency Estimation Model | Download |
| --- | --- | --- |
| MoCo | RBD | model |
| MoCo | BASNet | model |
| MoCo v2 | RBD | model |
| MoCo v2 | BASNet | model |
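To reuse one of these checkpoints for downstream tasks, the encoder weights need to be extracted. The snippet below is a sketch assuming the checkpoints follow the standard MoCo release format (query encoder stored under the "module.encoder_q." prefix, ResNet-50 backbone); the file name is a placeholder.

```python
# Sketch: extract the backbone from a MoCo-style checkpoint (assumptions above).
import torch
import torchvision.models as models

checkpoint = torch.load("moco_rbd.pth.tar", map_location="cpu")  # placeholder path
state_dict = checkpoint.get("state_dict", checkpoint)

# Keep only query-encoder weights, dropping the MoCo/DDP prefix and the fc head.
backbone_weights = {
    k.replace("module.encoder_q.", ""): v
    for k, v in state_dict.items()
    if k.startswith("module.encoder_q.")
    and not k.startswith("module.encoder_q.fc")
}

model = models.resnet50()
result = model.load_state_dict(backbone_weights, strict=False)
print(result)  # expect only the fc layer to be left uninitialized
```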

Citation

Please cite our paper if you use DiLo in your research or wish to refer to the results published in the paper.

@inproceedings{ZhaoAAAI2021, 
    author = {Nanxuan Zhao and Zhirong Wu and Rynson W.H. Lau and Stephen Lin}, 
    title = {Distilling Localization for Self-Supervised Representation Learning}, 
    booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence}, 
    year = {2021} 
}
