

Residual Attention Network for Image Classification (TensorFlow)

Paper information

  • Authors: Fei Wang, Mengqing Jiang, Chen Qian, Shuo Yang, Cheng Li, Honggang Zhang, Xiaogang Wang, Xiaoou Tang
  • Submitted: 23 Apr 2017
  • Venue: accepted to CVPR 2017
  • arXiv: https://arxiv.org/abs/1704.06904
  • Abstract:
In this work, we propose "Residual Attention Network", a convolutional neural network using attention mechanism
which can incorporate with state-of-art feed forward network architecture in an end-to-end training fashion.
Our Residual Attention Network is built by stacking Attention Modules which generate attention-aware features.
The attention-aware features from different modules change adaptively as layers going deeper.
Inside each Attention Module, bottom-up top-down feedforward structure is used to unfold the feedforward and feedback attention process into a single feedforward process.
Importantly, we propose attention residual learning to train very deep Residual Attention Networks which can be easily scaled up to hundreds of layers.
Extensive analyses are conducted on CIFAR-10 and CIFAR-100 datasets to verify the effectiveness of every module mentioned above.
Our Residual Attention Network achieves state-of-the-art object recognition performance on three benchmark datasets including CIFAR-10 (3.90% error),
CIFAR-100 (20.45% error) and ImageNet (4.8% single model and single crop, top-5 error).
Note that, our method achieves 0.6% top-1 accuracy improvement with 46% trunk depth and 69% forward FLOPs comparing to ResNet-200.
The experiment also demonstrates that our network is robust against noisy labels.
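The attention residual learning mentioned in the abstract combines a trunk branch T(x) with a soft mask branch M(x) as H(x) = (1 + M(x)) * T(x), so the mask modulates features without blocking the identity path. A minimal NumPy sketch of that formula (an illustration only, not the actual implementation in this repository, where function and variable names are my own):

```python
import numpy as np

def attention_residual(trunk, mask_logits):
    """Attention residual learning: H(x) = (1 + M(x)) * T(x).

    trunk: trunk-branch features T(x)
    mask_logits: raw mask-branch output, squashed to (0, 1) by a
    sigmoid so that when the mask is near zero, H(x) ~= T(x) and
    the identity mapping is preserved.
    """
    mask = 1.0 / (1.0 + np.exp(-mask_logits))  # M(x) in (0, 1)
    return (1.0 + mask) * trunk

# A strongly negative mask logit leaves trunk features nearly unchanged;
# a strongly positive one roughly doubles them.
trunk = np.array([1.0, 2.0, -3.0])
print(attention_residual(trunk, np.array([-100.0, 0.0, 100.0])))
```

Because the mask scales features by at most 2 rather than gating them to zero, very deep stacks of Attention Modules remain trainable, which is the point of the "hundreds of layers" claim above.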

About this code

purpose

This code was written by Koichiro Tamura to better understand Residual Attention Network for Image Classification.

explanation

I wrote a document (in Japanese) that explains Residual Attention Network for Image Classification. It will be available on SlideShare and on the website of Deep Learning JP.

how to use

Requirements

  • Anaconda 3.x
  • TensorFlow 1.x
  • Keras 2.x

train

$ python residual-attention-network/train.py

test

Sorry, I have not written a test script yet.