PyramidNet_with_Stochastic_Depth

This repository is a prototype based on PyramidNet (https://github.com/jhkim89/PyramidNet).

This repository contains the code for the paper "Deep Pyramidal Residual Networks with Separated Stochastic Depth" (https://arxiv.org/abs/1612.01230).

The code is based on Facebook's implementation of ResNet (https://github.com/facebook/fb.resnet.torch), PyramidNet (https://github.com/jhkim89/PyramidNet), and fb.resnet.torch-lesion-study (https://github.com/gcr/fb.resnet.torch-lesion-study).
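
For readers unfamiliar with the mechanism, the sketch below illustrates plain stochastic depth (Huang et al., 2016) for a single residual block in Torch. It is a hypothetical illustration, not the repository's StochasticDrop.lua: the module name nn.StochasticResidual and its interface are invented here, and it assumes the branch output has the same shape as the input. The paper's "separated" variant changes where the random drops are applied; see the paper for details.

require 'nn'

-- Hypothetical illustration only (NOT the repository's StochasticDrop.lua).
-- One residual block with stochastic depth: during training the branch
-- runs with probability keepProb and is skipped otherwise; at test time
-- it always runs, scaled by keepProb (the expected value).
local StochasticResidual, parent = torch.class('nn.StochasticResidual', 'nn.Container')

function StochasticResidual:__init(branch, keepProb)
   parent.__init(self)
   self.branch = branch      -- e.g. the conv-BN stack of a residual block
   self.keepProb = keepProb or 0.5
   self.modules = {branch}   -- register the child so type()/cuda() reach it
   self.gate = true          -- whether the branch ran on the last forward
   self.train = true
end

function StochasticResidual:updateOutput(input)
   if self.train then
      self.gate = torch.uniform() < self.keepProb
      if self.gate then
         self.output = torch.add(input, self.branch:updateOutput(input))
      else
         self.output = input:clone()   -- identity shortcut only
      end
   else
      -- test time: identity + keepProb * branch
      self.output = torch.add(input, self.keepProb, self.branch:updateOutput(input))
   end
   return self.output
end

-- Backward pass for training mode (the only mode we backprop in).
function StochasticResidual:updateGradInput(input, gradOutput)
   if not self.gate then
      self.gradInput = gradOutput:clone()   -- branch skipped: identity gradient
   else
      self.gradInput = self.branch:updateGradInput(input, gradOutput):clone()
      self.gradInput:add(gradOutput)        -- plus the shortcut's gradient
   end
   return self.gradInput
end

function StochasticResidual:accGradParameters(input, gradOutput, scale)
   if self.gate then
      self.branch:accGradParameters(input, gradOutput, scale)
   end
end

In Huang et al.'s formulation, keepProb is not constant: it decays linearly with depth, from 1 at the first block down to 0.5 at the last.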

Usage

  1. Install Torch (http://torch.ch) and fb.resnet.torch (https://github.com/facebook/fb.resnet.torch).
  2. Add the files pyramiddrop.lua, pyramidsepdrop.lua and StochasticDrop.lua (https://github.com/gcr/fb.resnet.torch-lesion-study/tree/master/models) to the "models" folder.
  3. Change the learning rate schedule in train.lua from "decay = epoch >= 122 and 2 or epoch >= 81 and 1 or 0" to "decay = epoch >= 225 and 2 or epoch >= 150 and 1 or 0" (see the snippet after this list).
  4. Train the networks by running main.lua with the commands given below.
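
For reference, the schedule changed in step 3 sits in the CIFAR branch of the learning-rate function in train.lua; the surrounding lines below are paraphrased from fb.resnet.torch and may differ between versions. With -LR 0.5 the change moves the two tenfold decays from epochs 81/122 to 150/225, i.e. 0.5 until epoch 149, 0.05 until 224, then 0.005:

-- train.lua, Trainer:learningRate(epoch), CIFAR-10 branch (paraphrased)
-- before: decay = epoch >= 122 and 2 or epoch >= 81 and 1 or 0
decay = epoch >= 225 and 2 or epoch >= 150 and 1 or 0
-- the rate actually used is opt.LR * 0.1^decay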

To train PyramidDrop-110 (alpha=90) on the CIFAR-10 dataset:

th main.lua -dataset cifar10 -nEpochs 300 -LR 0.5 -netType pyramiddrop -batchSize 128 -shareGradInput true
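
The separated-stochastic-depth model added in step 2 should be selectable the same way through -netType; for example (assuming pyramidsepdrop.lua exposes the same options):

th main.lua -dataset cifar10 -nEpochs 300 -LR 0.5 -netType pyramidsepdrop -batchSize 128 -shareGradInput true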

To train PyramidDrop-110 (alpha=90) on the CIFAR-10 dataset with 4 GPUs:

Change the code in the file models/init.lua

:threads(function()
  local cudnn = require 'cudnn'
  cudnn.fastest, cudnn.benchmark = fastest, benchmark
end)

to

:threads(function()
  local cudnn = require 'cudnn'
  require 'models/StochasticDrop'
  cudnn.fastest, cudnn.benchmark = fastest, benchmark
end)

and then run main.lua with -nGPU 4. The added require makes the custom StochasticDrop module visible inside the threads that nn.DataParallelTable spawns, so the model can be replicated across the GPUs:

th main.lua -dataset cifar10 -nEpochs 300 -LR 0.5 -netType pyramiddrop -batchSize 128 -shareGradInput true -nGPU 4 -nThreads 8
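
Note that in fb.resnet.torch, -batchSize is the total mini-batch size, which DataParallelTable splits across the -nGPU GPUs, while -nThreads sets the number of data-loading threads.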
