code for the paper "Adapting Models to Signal Degradation using Distillation", BMVC, 2017
# Cross Quality Distillation

## Introduction

This repository contains the code for reproducing the results in "Adapting Models to Signal Degradation using Distillation", BMVC 2017 (originally titled "Cross Quality Distillation" on arXiv).

```
@inproceedings{su2017adapting,
    Author    = {Jong-Chyi Su and Subhransu Maji},
    Title     = {Adapting Models to Signal Degradation using Distillation},
    Booktitle = {British Machine Vision Conference (BMVC)},
    Year      = {2017}
}
```

The code was tested on Ubuntu 14.04 with MATLAB R2014b and the MatConvNet package.
Link to the project page.
The code borrows heavily from B-CNN (https://bitbucket.org/tsungyu/bcnn).

## Instructions

  1. Follow the instructions on the VLFEAT and MatConvNet project pages to install them first. Our code is built on MatConvNet version 1.0-beta18.
  2. Change the paths in `setup.m`.
  3. Download the datasets.
  4. Run `save_images.m` to create the degraded images. (You need to download and install the Structured Edge Detector for generating edge images, and tpsWarp for generating distorted images.)
  5. Download the pre-trained VGG models and put them under `data/models/` (vgg-m and vgg-vd are used in the paper).
  6. Run `run_CQD.m` to train all the baseline models and the distillation model.
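For intuition on what step 6 trains, the distillation objective encourages the student network (which sees degraded images) to match the temperature-softened predictions of the teacher network (which sees the original high-quality images). The sketch below is illustrative only: the variable names, the temperature value, and the exact loss form are assumptions on our part, not the repository's code; the actual implementation is in `DistillLoss.m` and `vl_nndistillloss.m`.

```matlab
% Illustrative sketch (not the repository's implementation) of a
% distillation-style loss: soft cross-entropy between the teacher's and
% student's temperature-scaled softmax outputs.
T = 10;                                     % softmax temperature (assumed value)
soften = @(z) exp(z/T) ./ sum(exp(z/T));    % temperature-scaled softmax

teacherLogits = [2.0; 0.5; -1.0];           % example scores from the high-quality model
studentLogits = [1.5; 0.8; -0.5];           % example scores from the degraded-input model

pTeacher = soften(teacherLogits);           % soft targets from the teacher
pStudent = soften(studentLogits);           % student predictions
loss = -sum(pTeacher .* log(pStudent));     % soft cross-entropy term
```

In practice this soft-target term is typically combined with the usual cross-entropy on the ground-truth labels; see the paper and `vl_nndistillloss.m` for the exact formulation used.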

## Results

Please see Table 1 in the paper.

## Acknowledgements

Thanks to Tsung-Yu Lin for sharing the codebase, and to the MatConvNet team.
Please contact jcsu@cs.umass.edu if you have any questions.