
Tailoring Self-Supervision for Supervised Learning

[Arxiv] | [Paper] | [Video]

Official PyTorch Repository of "Tailoring Self-Supervision for Supervised Learning" (ECCV 2022 Paper)

Abstract. Recently, it has been shown that deploying a proper self-supervision is a promising way to enhance the performance of supervised learning. Yet, the benefits of self-supervision are not fully exploited, as previous pretext tasks are specialized for unsupervised representation learning. To this end, we begin by presenting three desirable properties for such auxiliary tasks to assist the supervised objective. First, the tasks need to guide the model to learn rich features. Second, the transformations involved in the self-supervision should not significantly alter the training distribution. Third, the tasks are preferred to be light and generic for high applicability to prior arts. Subsequently, to show how existing pretext tasks can fulfill these properties and be tailored for supervised learning, we propose a simple auxiliary self-supervision task, predicting localizable rotation (LoRot). Our exhaustive experiments validate the merits of LoRot as a pretext task tailored for supervised learning in terms of robustness and generalization capability.
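
The core idea above, rotating only a local patch of the input and predicting that rotation as an auxiliary objective, can be summarized in a few lines of PyTorch. The sketch below is illustrative only: the `lorot_transform` helper, the toy backbone, the two linear heads, and the 0.1 auxiliary loss weight are assumptions for demonstration, not the repository's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def lorot_transform(x: torch.Tensor):
    """Rotate one randomly chosen quadrant of each image in the batch.

    Sketch of the LoRot idea: only a local patch is rotated, so the
    global training distribution is largely preserved. Returns the
    transformed batch and a 16-way auxiliary label encoding the
    (quadrant, rotation) pair.
    """
    b, _, h, w = x.shape
    assert h == w, "this sketch assumes square images"
    half = h // 2
    quad = torch.randint(0, 4, (b,))  # which quadrant to rotate
    rot = torch.randint(0, 4, (b,))   # rotation as a multiple of 90 degrees
    x = x.clone()
    for i in range(b):
        r0 = int(quad[i] // 2) * half
        c0 = int(quad[i] % 2) * half
        patch = x[i, :, r0:r0 + half, c0:c0 + half]
        x[i, :, r0:r0 + half, c0:c0 + half] = torch.rot90(
            patch, int(rot[i]), dims=(1, 2))
    return x, quad * 4 + rot


# Joint objective with a toy backbone and two heads (names and sizes
# are illustrative, not the paper's architecture).
backbone = nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                         nn.AdaptiveAvgPool2d(1), nn.Flatten())
cls_head = nn.Linear(16, 10)  # main supervised head (10 classes)
aux_head = nn.Linear(16, 16)  # 4 quadrants x 4 rotations = 16 classes

images = torch.randn(8, 3, 32, 32)
labels = torch.randint(0, 10, (8,))
images, aux_labels = lorot_transform(images)
feats = backbone(images)
loss = (F.cross_entropy(cls_head(feats), labels)
        + 0.1 * F.cross_entropy(aux_head(feats), aux_labels))  # 0.1: assumed weight
loss.backward()
```

Since LoRot is an auxiliary objective, the extra head can simply be dropped at inference time, which is consistent with the lightness and genericity properties stated above.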

Cite LoRot (Tailoring Self-Supervision for Supervised Learning)

If you find this repository useful, please cite it with the following BibTeX entry.

@inproceedings{moon2022tailoring,
  title={Tailoring Self-Supervision for Supervised Learning},
  author={Moon, WonJun and Kim, Ji-Hwan and Heo, Jae-Pil},
  booktitle={European Conference on Computer Vision},
  year={2022},
  organization={Springer}
}

Contributors and Contact

If you have any questions, feel free to contact the authors: WonJun Moon (wjun0830@gmail.com), Ji-Hwan Kim (damien911224@gmail.com).
