Train Deep Residual Network from Scratch or Fine-tune Pre-trained Model using Matconvnet
ResNet-Matconvnet

This repository is a Matconvnet re-implementation of "Deep Residual Learning for Image Recognition" by Kaiming He, Xiangyu Zhang, Shaoqing Ren, and Jian Sun. You can train a Deep Residual Network on ImageNet from scratch or fine-tune a pre-trained model on your own dataset. This repo is created by Hang Zhang.

Table of Contents

  1. Get Started
  2. Train from Scratch
  3. Fine-tune Your Own
  4. Changes

Get Started

The code relies on vlfeat and matconvnet, which should be downloaded and built before running the experiments. You can use the following command to download them:

git clone -b v1.0 --recurse-submodules https://github.com/zhanghang1989/ResNet-Matconvnet.git

If you have problems with compilation, please refer to the link.
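Once the submodules are cloned, matconvnet can typically be compiled from inside MATLAB with `vl_compilenn`. A minimal sketch (the `dependencies/matconvnet` path assumes the submodule layout created by the clone command above; adjust if your checkout differs):

```matlab
% Put matconvnet on the MATLAB path (path assumes the submodule layout above)
run dependencies/matconvnet/matlab/vl_setupnn.m;

% Compile the MEX files; set 'enableGpu' to true for a CUDA-enabled build
vl_compilenn('enableGpu', false, 'verbose', 1);
```

For GPU training, pass `'enableGpu', true` (and, if needed, point `'cudaRoot'` at your CUDA toolkit installation).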

Train from Scratch

  1. Cifar. Reproducing Figure 6 from the original paper.

    run_cifar_experiments([20 32 44 56 110], 'plain', 'gpus', [1]);
    run_cifar_experiments([20 32 44 56 110], 'resnet', 'gpus', [1]);

    (figure: Cifar experiments)

    Reproducing the experiments in the Facebook blog. Removing the ReLU layer at the end of each residual unit, we observe a small but significant improvement in test performance, and convergence becomes smoother.

    res_cifar(20, 'modelType', 'resnet', 'reLUafterSum', false,...
        'expDir', 'data/exp/cifar-resNOrelu-20', 'gpus', [2])
    plot_results_mix('data/exp','cifar',[],[],'plots',{'resnet','resNOrelu'})
  2. Imagenet2012. Download the dataset to data/ILSVRC2012 and follow the instructions in setup_imdb_imagenet.m.

    run_experiments([50 101 152], 'gpus', [1 2 3 4 5 6 7 8]);
  3. Your own dataset.

    run_experiments([18 34],'datasetName', 'minc',...
    'datafn', @setup_imdb_minc, 'nClasses', 23, 'gpus', [1 2]);

Fine-tune Your Own

  1. Download

  2. Fine-tuning

    res_finetune('datasetName', 'minc', 'datafn',...
    @setup_imdb_minc, 'gpus',[1 2]);

Changes

  1. 06/21/2016:
  2. 05/17/2016:
    • Reproduced the experiments in the Facebook blog, removing the ReLU layer at the end of each residual unit.
  3. 05/02/2016:

    • Supported official Matconvnet version.
    • Added Cifar experiments and plots.
  4. 04/27/2016: Re-implementation of Residual Network: