
Contrastive Predictive Coding

PyTorch implementation of the following papers:

A. van den Oord, Y. Li, and O. Vinyals, "Representation Learning with Contrastive Predictive Coding"

O. J. Hénaff, A. Srinivas, J. De Fauw, A. Razavi, C. Doersch, S. M. A. Eslami, and A. van den Oord, "Data-Efficient Image Recognition with Contrastive Predictive Coding"

A VGG16 encoder is also available. In addition, two auxiliary self-supervised objectives are included: rotation prediction from "Unsupervised Representation Learning by Predicting Image Rotations" and the Exemplar objective from "Exemplar VAE: Linking Generative Models, Nearest Neighbor Retrieval, and Data Augmentation".
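At the core of CPC is the InfoNCE loss: a context-conditioned prediction should score higher against its true target patch embedding than against negatives drawn from the rest of the batch. The following is a minimal NumPy sketch for illustration only, not the repository's implementation; the function name `info_nce` and the array shapes are assumptions:

```python
import numpy as np

def info_nce(z_pred, z_pos, temperature=1.0):
    """InfoNCE loss for one batch: each predicted embedding z_pred[i]
    should score highest against its true target z_pos[i]; the other
    rows of the batch serve as negatives."""
    # Similarity matrix: logits[i, j] = z_pred[i] . z_pos[j]
    logits = z_pred @ z_pos.T / temperature
    # Cross-entropy with the diagonal as the correct class
    logits -= logits.max(axis=1, keepdims=True)  # numeric stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 64))
# Predictions aligned with their targets give a loss near zero,
# while unrelated predictions give a much larger loss.
aligned = info_nce(z, z)
shuffled = info_nce(z, rng.normal(size=(8, 64)))
print(aligned, shuffled)
```

Minimizing this loss pushes each prediction toward its own target and away from the other patches in the batch, which is what makes the learned encoder useful for the downstream classifier.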

An environment.yml file is included for reproducing the conda environment.

Usage

There are two training scripts: train_CPC.py for the unsupervised (pretraining) stage and train_classifier.py for the supervised stage.

  • Viewing all command-line options
    python train_classifier.py -h
    
    python train_CPC.py -h
    
  • Training a fully supervised model
    python train_classifier.py --fully_supervised --dataset stl10 --encoder resnet18
    
  • Training ResNet-14 on STL10 with CPCV1 - Unsupervised Stage
    python train_CPC.py --dataset stl10 --epochs 300 --crop 64-0 --encoder resnet14 --norm none --grid_size 7 --pred_steps 5 --pred_directions 1
    
  • Training WideResNet-28-2 on CIFAR10 with CPCV2 (and a smaller grid size) - Unsupervised Stage
    python train_CPC.py --dataset cifar10 --epochs 500 --crop 30-2 --encoder wideresnet-28-2 --norm layer --grid_size 5 --pred_steps 3 --pred_directions 4 --patch_aug
    
  • Training WideResNet-28-2 on CIFAR10 with CPCV2 (and a smaller grid size) - Supervised Stage with 10,000 labeled images
    python train_classifier.py --dataset cifar10 --train_size 10000 --epochs 100 --lr 0.1 --crop 30-2 --encoder wideresnet-28-2 --norm layer --grid_size 5 --pred_directions 4 --cpc_patch_aug --patch_aug --model_num 500
    
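The rotation-prediction auxiliary task mentioned above can be sketched as follows. This is a minimal NumPy illustration, not the repository's code; the helper name `make_rotation_batch` is hypothetical:

```python
import numpy as np

def make_rotation_batch(images):
    """Rotation pretext task: each image is rotated by 0/90/180/270
    degrees, and the model is trained to classify which rotation was
    applied (labels 0-3). Works on arrays shaped (N, H, W[, C])."""
    rotated, labels = [], []
    for img in images:
        for k in range(4):  # k quarter-turns counter-clockwise
            rotated.append(np.rot90(img, k=k, axes=(0, 1)))
            labels.append(k)
    return np.stack(rotated), np.array(labels)

batch = np.arange(2 * 4 * 4).reshape(2, 4, 4).astype(np.float32)
x, y = make_rotation_batch(batch)
print(x.shape, y[:4])  # (8, 4, 4) [0 1 2 3]
```

Each input thus yields four training examples, and solving the four-way classification forces the encoder to learn object orientation cues without any human labels.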
