
Customizable and Effective Dropout


This repository contains the code for the paper Effective and Efficient Dropout for Deep Convolutional Neural Networks. The paper proposes customizable and effective dropout blocks to support complex analytics with convolutional neural networks.

The illustration of the convolutional transformations with 4 structural levels of dropout (a minimal PyTorch sketch follows this list):

  1. Dropout, or drop-neuron, gates input neurons in operation 1;
  2. Drop-channel replaces the identity mapping in operation 2 with operation 3, i.e. random sampling and gating on channels;
  3. Drop-path is applied to F_conv in operation 4;
  4. Drop-layer is applied to the shortcut connection in operation 5.
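
A minimal PyTorch sketch of the four structural levels, for orientation only. The class name, block layout, and the exact placement of each dropout level are assumptions made for illustration; the repository's actual building blocks live in models/convBlock.py.

import torch
import torch.nn as nn
import torch.nn.functional as F

class IllustrativeDropoutBlock(nn.Module):
    """Illustrative residual block applying the four structural dropout levels."""

    def __init__(self, channels, drop_rate=0.1, drop_path_rate=0.1, drop_layer_rate=0.1):
        super().__init__()
        # Residual branch F_conv (layout assumed for illustration).
        self.f_conv = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.drop_neuron = nn.Dropout(drop_rate)     # level 1: gate individual neurons
        self.drop_channel = nn.Dropout2d(drop_rate)  # level 2: gate whole channels
        self.drop_path_rate = drop_path_rate         # level 3: drop the F_conv branch
        self.drop_layer_rate = drop_layer_rate       # level 4: drop the shortcut

    def forward(self, x):
        # Level 1: drop-neuron gates the input neurons.
        out = self.drop_neuron(x)
        # Level 3: drop-path stochastically skips the convolutional branch F_conv.
        branch = self.f_conv(out)
        if self.training and torch.rand(()) < self.drop_path_rate:
            branch = torch.zeros_like(branch)
        # Level 2: drop-channel randomly samples and gates channels on the identity mapping.
        shortcut = self.drop_channel(x)
        # Level 4: drop-layer stochastically drops the shortcut connection.
        if self.training and torch.rand(()) < self.drop_layer_rate:
            shortcut = torch.zeros_like(shortcut)
        return F.relu(branch + shortcut)

At evaluation time all four gates are disabled: the stochastic branch/shortcut drops are guarded by self.training, and nn.Dropout / nn.Dropout2d act as identity in eval mode.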

The illustration of an example of the proposed building block:

The repo includes:

  1. example models (/models)
  2. code for dropout training (train.py)
  3. code supporting different structural levels of dropout (models/convBlock.py)
    • effective dropout with customizable building blocks (models/convBlock/conv_block)

Training

Dependencies

* python 3.7.3
* pytorch 1.2.0
* torchvision 0.4.0

Model Training

Example training code:

CUDA_VISIBLE_DEVICES=0 python train.py --net_type=resnet --depth 110 --arg1 1 --epoch 164 --weight_decay 1e-4 --block_type 0 --drop_type=1 --drop_rate=0.1 --exp_name resnet_dropChannel --report_ratio

Please check the help information in argparse.ArgumentParser (train.py) for more details.
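
For reference, a minimal sketch of how the flags from the example command above might be declared with argparse. The types, defaults, and description are assumptions made for illustration; train.py remains the authoritative definition.

import argparse

def parse_args():
    # Flags mirror the example command above; types and defaults are illustrative.
    parser = argparse.ArgumentParser(description="Dropout training (illustrative parser)")
    parser.add_argument("--net_type", type=str, default="resnet")
    parser.add_argument("--depth", type=int, default=110)
    parser.add_argument("--arg1", type=int, default=1)
    parser.add_argument("--epoch", type=int, default=164)
    parser.add_argument("--weight_decay", type=float, default=1e-4)
    parser.add_argument("--block_type", type=int, default=0)
    parser.add_argument("--drop_type", type=int, default=1)
    parser.add_argument("--drop_rate", type=float, default=0.1)
    parser.add_argument("--exp_name", type=str, default="resnet_dropChannel")
    parser.add_argument("--report_ratio", action="store_true")
    return parser.parse_args()

if __name__ == "__main__":
    print(vars(parse_args()))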

Contact

To ask questions or report issues, please open an issue here or send us an email directly.
