DenseNet-TensorFlow

DenseNet for cifar10

Introduction

DenseNet is simply implemented in TensorFlow; the flow chart of DenseNet is shown in the figure below. The greatest advantages of DenseNet are high accuracy and a low memory footprint. This code uses a DenseNet of depth 40 for cifar10 classification; when the model is saved as a .ckpt file, the checkpoint only costs about 2~3 MB, which is very light.
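For reference, here is a minimal sketch of a dense block written against the TensorFlow 1.x graph API (matching the tensorflow 1.4.0 listed below). The layer count, growth rate, and scope names are illustrative assumptions and may differ from the settings actually used in main.py.

```python
import tensorflow as tf

def dense_block(x, num_layers=12, growth_rate=12, is_training=True):
    """Sketch of a DenseNet dense block (TensorFlow 1.x graph API).

    Each layer produces `growth_rate` new feature maps and is concatenated
    with all previous feature maps, which is the core DenseNet idea.
    Hyper-parameters here are illustrative, not taken from this repo.
    """
    for i in range(num_layers):
        with tf.variable_scope("layer_%d" % i):
            # BN -> ReLU -> 3x3 conv, as in the original DenseNet paper
            y = tf.layers.batch_normalization(x, training=is_training)
            y = tf.nn.relu(y)
            y = tf.layers.conv2d(y, filters=growth_rate, kernel_size=3,
                                 padding="same", use_bias=False)
            # Concatenate the new features with all earlier ones (channel axis)
            x = tf.concat([x, y], axis=3)
    return x
```

Because every layer sees the feature maps of all preceding layers, the number of parameters stays small, which is why the saved checkpoint is only a few megabytes.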

How to use the code

  1. Download the cifar10 data (cifar10 address)
  2. Unzip the cifar10 data and put the batches into the folder 'cifar10':
├── cifar10
     ├── data_batch_1.mat
     ├── data_batch_2.mat
     ├── data_batch_3.mat
     ├── data_batch_4.mat
     ├── data_batch_5.mat
     ├── test_batch.mat
     ...
  3. Execute main.py (a sketch of how the .mat batches can be read is shown below)
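For illustration, a minimal sketch of reading one of the .mat batches with scipy follows. The key names 'data' and 'labels' are those of the official CIFAR-10 MATLAB files, and the helper name is hypothetical; main.py may load the data differently.

```python
import numpy as np
import scipy.io as sio

def load_cifar10_batch(path):
    """Load one CIFAR-10 batch from the MATLAB (.mat) version of the dataset."""
    batch = sio.loadmat(path)
    # 'data' is (10000, 3072) uint8: 1024 R, 1024 G, 1024 B values per image
    images = batch["data"].reshape(-1, 3, 32, 32).transpose(0, 2, 3, 1)
    labels = batch["labels"].flatten()
    return images.astype(np.float32) / 255.0, labels

# Example: load the five training batches from the 'cifar10' folder
# batches = [load_cifar10_batch("cifar10/data_batch_%d.mat" % i) for i in range(1, 6)]
```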

Python packages

========================

  1. python3.5
  2. tensorflow 1.4.0
  3. numpy
  4. scipy

========================

Results

(Figures: loss, training error, and validation error curves)

In this experiment we don't use data augmentation and train for only 100 epochs, so training doesn't seem to have converged yet. In the original paper, the depth-40 DenseNet reaches a test error of 7%, which is about 3 percentage points lower than this code. Due to limited hardware, I didn't train the DenseNet for 300 epochs as in the paper. I would very much appreciate it if somebody could run the code for 300 epochs.
