# Deep Competitive Pathway Network (CoPaNet)

This repository contains the code for CoPaNet introduced in the paper "Deep Competitive Pathway Network" by Jia-Ren Chang and Yong-Sheng Chen.

This paper is accepted by Asian Conference on Machine Learning (ACML) 2017.

The code is built on fb.resnet.torch.

## Introduction

CoPaNet is a network architecture in which multiple pathways compete with each other. This competition gives rise to a novel phenomenon we call "pathway encoding": the routing patterns of features can represent object semantics. CoPaNet achieves state-of-the-art accuracy on CIFAR-10 and SVHN. On the large-scale ILSVRC 2012 (ImageNet) dataset, CoPaNet reaches accuracy similar to ResNet while using fewer parameters.
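The core idea can be sketched in a few lines of Torch code. The block below is only an illustration under assumed layer choices (3x3 convolution, batch normalization, ReLU), not the exact block defined in `models/CoPaNet.lua`: two parallel pathways produce feature maps of the same shape, and an element-wise maximum lets them compete at every feature position.

```lua
-- Illustrative competitive block: two pathways joined by element-wise max.
-- Layer choices here are assumptions for the sketch, not the paper's block.
require 'nn'

local function competitiveBlock(nIn, nOut)
   local function pathway()
      return nn.Sequential()
         :add(nn.SpatialConvolution(nIn, nOut, 3, 3, 1, 1, 1, 1))
         :add(nn.SpatialBatchNormalization(nOut))
         :add(nn.ReLU(true))
   end
   return nn.Sequential()
      :add(nn.ConcatTable():add(pathway()):add(pathway())) -- two competing pathways
      :add(nn.CMaxTable())  -- keep the larger activation at every position
end

local block = competitiveBlock(16, 16)
print(block:forward(torch.randn(1, 16, 32, 32)):size())  -- 1x16x32x32
```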

Figure 1: The concept of pathway encoding.

Figure 2: The pathway encoding on CIFAR-10 test set.

## Usage

  1. Install Torch and required dependencies such as cuDNN. See the instructions here for a step-by-step guide.
  2. Clone this repo: `git clone https://github.com/JiaRenChang/CoPaNet.git`

We also provide our implementation of `CMaxTable`, which runs more than 2x faster than the naive implementation in Torch's nn.
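For reference, the operation itself is just an element-wise maximum over a table of tensors. The snippet below only illustrates that behaviour with the stock `nn.CMaxTable` and `torch.cmax`; the optimized module lives in `CMaxTable.lua`.

```lua
-- What CMaxTable computes: an element-wise max over a table of tensors.
-- Purely illustrative; the repository's CMaxTable.lua is the faster version.
require 'nn'

local a = torch.randn(4, 8)
local b = torch.randn(4, 8)

local out1 = nn.CMaxTable():forward({a, b})  -- module form
local out2 = torch.cmax(a, b)                -- direct tensor form

print((out1 - out2):abs():max())  -- should print 0 (identical results)
```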

As an example, the following command trains a CoPaNet with depth 164 on CIFAR-10:

```
th main.lua -netType CoPaNet -dataset cifar10 -batchSize 128 -nEpochs 300 -depth 164
```

As another example, the following command trains a CoPaNet with depth 26 on ImageNet:

```
th main.lua -netType CoPaNet -dataset imagenet -data [dataFolder] -batchSize 256 -nEpochs 100 -depth 26 -nGPU 4
```

Please refer to fb.resnet.torch for data preparation.

## Contact

followwar at gmail.com

Any discussions, suggestions, and questions are welcome!
