DiLo Code and Pretrained Models

Distilling Localization for Self-Supervised Representation Learning
Nanxuan Zhao*, Zhirong Wu*, Rynson W.H. Lau, Stephen Lin

Dataloader

saliency_dataset.py is an example dataloader for adding the DiLo copy-and-paste augmentation.
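For reference, here is a minimal sketch of the copy-and-paste idea, not the actual implementation in saliency_dataset.py: the salient foreground of one image is blended onto a different background image using a saliency mask. The function and argument names are illustrative, and the sketch assumes the saliency maps are stored as single-channel images.

# Illustrative sketch only; see saliency_dataset.py for the real dataloader.
import numpy as np
from PIL import Image

def copy_paste(image: Image.Image, saliency: Image.Image, background: Image.Image) -> Image.Image:
    """Paste the salient foreground of `image` onto `background`.

    `saliency` is a single-channel mask (e.g. from RBD or BASNet), where
    higher values mark the foreground object.
    """
    fg = np.asarray(image.convert("RGB"), dtype=np.float32)
    bg = np.asarray(background.resize(image.size).convert("RGB"), dtype=np.float32)
    mask = np.asarray(saliency.resize(image.size).convert("L"), dtype=np.float32) / 255.0
    mask = mask[..., None]  # broadcast the mask over the RGB channels
    composite = mask * fg + (1.0 - mask) * bg
    return Image.fromarray(composite.astype(np.uint8))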

The saliency estimation models can be found:

Pretrained Models

Base Model | Saliency Estimation Model | Download
MoCo       | RBD                       | model
MoCo       | BASNet                    | model
MoCo v2    | RBD                       | model
MoCo v2    | BASNet                    | model
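
The checkpoints are intended as initialization for downstream tasks. As a rough sketch, assuming they follow the standard MoCo checkpoint format with query-encoder weights stored under a module.encoder_q. prefix (the filename below is a placeholder), the backbone can be extracted into a torchvision ResNet-50 like this:

# Illustrative sketch; assumes the standard MoCo checkpoint layout.
import torch
import torchvision.models as models

checkpoint = torch.load("dilo_moco_v2_basnet.pth", map_location="cpu")  # placeholder filename
state_dict = checkpoint.get("state_dict", checkpoint)

# Keep only the query encoder and drop its prefix so keys match ResNet-50.
backbone_state = {
    k.replace("module.encoder_q.", ""): v
    for k, v in state_dict.items()
    if k.startswith("module.encoder_q.") and not k.startswith("module.encoder_q.fc")
}

model = models.resnet50()
missing, unexpected = model.load_state_dict(backbone_state, strict=False)
print("missing keys:", missing)      # should only be the final fc layer
print("unexpected keys:", unexpected)

If the released checkpoints use a different key layout, adjust the prefix accordingly.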

Citation

Please cite our paper if you use DiLo in your research or wish to refer to the results published in the paper.

@inproceedings{ZhaoAAAI2021, 
    author = {Nanxuan Zhao and Zhirong Wu and Rynson W.H. Lau and Stephen Lin}, 
    title = {Distilling Localization for Self-Supervised Representation Learning}, 
    booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence}, 
    year = {2021} 
}
