ScatSimCLR: self-supervised contrastive learning with pretext task regularization for small-scale datasets

This repo contains the official PyTorch implementation of the paper:

ScatSimCLR: self-supervised contrastive learning with pretext task regularization for small-scale datasets, accepted at the ICCV 2021 2nd Visual Inductive Priors for Data-Efficient Deep Learning Workshop
Vitaliy Kinakh, Olga Taran, Sviatoslav Voloshynovskiy

Paper

  • 🏆 SOTA on CIFAR20 Unsupervised Image Classification. Check out Papers With Code

Contents

  1. Introduction
  2. Installation
  3. Training
  4. Evaluation
  5. Results
  6. Citation

Introduction

In this paper, we consider the problem of self-supervised learning for small-scale datasets based on contrastive loss. Factors such as the complexity of training, which typically requires complex architectures, the number of views produced by data augmentation, and their impact on classification accuracy remain understudied. We consider a contrastive-loss architecture such as SimCLR, in which the baseline model is replaced by the geometrically invariant “hand-crafted” network ScatNet combined with a small trainable adapter network, and argue that the number of parameters of the whole system and the number of views can be considerably reduced while practically preserving the same classification accuracy.
In addition, we investigate the impact of regularization strategies based on pretext task learning, where the parameters of augmentation transforms such as rotation and jigsaw permutation are estimated, for both traditional baseline models and ScatNet-based models.
Finally, we demonstrate that the proposed architecture with pretext task regularization achieves state-of-the-art classification performance with a smaller number of trainable parameters and a reduced number of views.

In terms of classification accuracy, we outperform state-of-the-art methods, in particular by +8.9% on CIFAR20 and by INSERT HERE on STL10.
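
The pipeline described above can be sketched as follows. This is a minimal, illustrative sketch only: it assumes kymatio's Scattering2D as the ScatNet front-end, and the adapter and projection-head sizes are placeholders rather than the configurations used in the paper.

    import torch
    import torch.nn as nn
    from kymatio.torch import Scattering2D


    class ScatSimCLRSketch(nn.Module):
        """Fixed ScatNet front-end + small trainable adapter + SimCLR-style projection head."""

        def __init__(self, J=2, L=16, image_size=(32, 32), out_dim=128):
            super().__init__()
            # Hand-crafted, geometrically invariant feature extractor (no trainable weights)
            self.scatnet = Scattering2D(J=J, shape=image_size, L=L)
            # Small trainable adapter on top of the scattering coefficients
            self.adapter = nn.Sequential(
                nn.AdaptiveAvgPool2d(1),
                nn.Flatten(),
                nn.LazyLinear(512),
                nn.ReLU(inplace=True),
            )
            # Projection head for the contrastive (NT-Xent) loss, as in SimCLR
            self.projection = nn.Sequential(
                nn.Linear(512, 256),
                nn.ReLU(inplace=True),
                nn.Linear(256, out_dim),
            )

        def forward(self, x):
            s = self.scatnet(x)                                   # (B, C, K, H/2^J, W/2^J)
            s = s.reshape(s.size(0), -1, s.size(-2), s.size(-1))  # merge image and scattering channels
            return self.projection(self.adapter(s))


    # Two augmented views of each image are pushed through the same network and
    # their embeddings are compared with a contrastive (NT-Xent) loss.
    model = ScatSimCLRSketch(J=2, L=16, image_size=(32, 32))
    z = model(torch.randn(4, 3, 32, 32))  # -> embeddings of shape (4, 128)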

Installation

Conda installation

conda env create -f env.yml
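
Then activate the created environment. The environment name is defined in env.yml; the name used below is an assumption:

conda activate scatsimclr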

Training

To run training without a pretext task, fill in the config file. An example of a detailed config file for training without a pretext task is config.yaml.

Then run

python main.py --mode unsupervised --config <path to config file>
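
For example, using the provided example config (adjust the path if the file lives in a subdirectory):

python main.py --mode unsupervised --config config.yaml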

To run training with a pretext task, fill in the config file. An example of a detailed config file for training with a pretext task is config_pretext.yaml.

Then run

python main.py --mode pretext --config <path to config file>
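
For example, using the provided example config (adjust the path if needed):

python main.py --mode pretext --config config_pretext.yaml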

Evaluation

To run evaluation, fill in the config file the same way as the config for training without a pretext task (config.yaml). Put the path to the trained model in fine_tune_from.

Then run

python evaluate.py --config <path to config file>
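
For example, reusing the unsupervised training config after setting fine_tune_from:

python evaluate.py --config config.yaml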

Results

| Dataset | Top-1 accuracy | Model        | Image size | J | L  | Download link |
|---------|----------------|--------------|------------|---|----|---------------|
| STL10   | 85.11%         | ScatSimCLR30 | (96, 96)   | 2 | 16 | Download      |
| CIFAR20 | 63.86%         | ScatSimCLR30 | (32, 32)   | 2 | 16 | Download      |
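
The download links point to trained checkpoints. A minimal sketch for inspecting one locally, assuming it is a standard PyTorch checkpoint; the filename below is hypothetical, and the exact contents depend on how the training script saves models:

    import torch

    # Load on CPU so no GPU is needed just to inspect the file
    ckpt = torch.load("scatsimclr30_stl10.pth", map_location="cpu")  # hypothetical filename
    if isinstance(ckpt, dict):
        # Typical checkpoints hold a state_dict and possibly optimizer state / epoch counters
        print(list(ckpt.keys())[:10])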

Citation

@inproceedings{
    kinakh2021scatsimclr,
    title={ScatSim{CLR}: self-supervised contrastive learning with pretext task regularization for small-scale datasets},
    author={Vitaliy Kinakh and Slava Voloshynovskiy and Olga Taran},
    booktitle={2nd Visual Inductive Priors for Data-Efficient Deep Learning Workshop},
    year={2021},
    url={https://openreview.net/forum?id=IQ87KPOWyg1}
}
