Added BN+Sigmoid and TanH networks
ducha-aiki committed Mar 5, 2016
1 parent 2d7739f commit 429cd60
Showing 15 changed files with 104,705 additions and 22 deletions.
3 changes: 3 additions & 0 deletions Activations.md
@@ -15,6 +15,7 @@ Because LRN layers add nothing to accuracy, they were removed for speed reasons
| Name | Accuracy | LogLoss | Comments |
| -------|---------:| -------:|:-----------|
| [ReLU](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2010_NairH10.pdf) |0.470| 2.36 | With LRN layers|
| ReLU |0.470| 2.36 | No LRN, as in rest |
| TanH |0.401| 2.78 | |
| [VLReLU](https://web.stanford.edu/~awni/papers/relu_hybrid_icml2013_final.pdf) |0.469| 2.40|y=max(x,x/3)|
| [RReLU](http://arxiv.org/abs/1505.00853) |0.478| 2.32| |
| [Maxout](http://arxiv.org/abs/1302.4389) |0.482| 2.30| sqrt(2) narrower layers, 2 pieces|
@@ -32,6 +33,8 @@ Because LRN layers add nothing to accuracy, they were removed for speed reasons
| PReLU |**0.503**| **2.19** | |
| ELU |0.498| 2.23 | |
| Maxout |0.487| 2.28| |
| Sigmoid |0.475| 2.35| |


![CaffeNet128 test accuracy](/logs/activations/img/0.png)
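For reference, here is a minimal NumPy sketch of the point-wise activations compared in the tables above. The function names, the RReLU slope range, and the maxout weights are illustrative assumptions; the benchmark itself implements these as Caffe layers.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def vlrelu(x):
    # Very leaky ReLU, as defined in the table: y = max(x, x/3)
    return np.maximum(x, x / 3.0)

def rrelu(x, lo=1.0 / 8.0, hi=1.0 / 3.0, train=True, rng=np.random):
    # Randomized leaky ReLU: the negative slope is sampled uniformly at
    # train time and fixed to the midpoint at test time (range assumed).
    slope = rng.uniform(lo, hi) if train else (lo + hi) / 2.0
    return np.where(x >= 0.0, x, slope * x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    return np.tanh(x)

def maxout(x, W1, b1, W2, b2):
    # Maxout with 2 linear pieces: y = max(x @ W1 + b1, x @ W2 + b2).
    # The benchmark uses sqrt(2)-narrower layers to keep the parameter
    # count comparable.
    return np.maximum(x @ W1 + b1, x @ W2 + b2)
```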

2 changes: 2 additions & 0 deletions README.md
@@ -29,6 +29,7 @@ On-going evaluations with graphs:
| Name | Accuracy | LogLoss | Comments |
| -------|---------:| -------:|:-----------|
| [ReLU](http://machinelearning.wustl.edu/mlpapers/paper_files/icml2010_NairH10.pdf) |0.470| 2.36 | With LRN layers|
| ReLU |0.470| 2.36 | No LRN, as in rest |
| TanH |0.401| 2.78 | |
| [VLReLU](https://web.stanford.edu/~awni/papers/relu_hybrid_icml2013_final.pdf) |0.469| 2.40|y=max(x,x/3)|
| [RReLU](http://arxiv.org/abs/1505.00853) |0.478| 2.32| |
| [Maxout](http://arxiv.org/abs/1302.4389) |0.482| 2.30| sqrt(2) narrower layers, 2 pieces|
@@ -258,6 +259,7 @@ So in all subsequent experiments, BN is placed after the non-linearity
| PReLU |**0.503**| **2.19** | |
| ELU |0.498| 2.23 | |
| Maxout |0.487| 2.28| |
| Sigmoid |0.475| 2.35| |
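
As the hunk context above notes, these experiments place BN after the non-linearity. Below is a hedged sketch of the two orderings, using a simplified batch-style normalization with the learned scale and shift omitted; this is not Caffe's exact BatchNorm/Scale pair, just the ordering idea.

```python
import numpy as np

def batchnorm(x, eps=1e-5):
    # Normalize each feature over the batch axis; learned scale and
    # shift are omitted for brevity.
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def block_bn_before(x, W):
    # Linear -> BN -> non-linearity: the ordering from the original BN paper.
    return sigmoid(batchnorm(x @ W))

def block_bn_after(x, W):
    # Linear -> non-linearity -> BN: the ordering used in these experiments.
    return batchnorm(sigmoid(x @ W))
```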

### BN and dropout

346 changes: 324 additions & 22 deletions ResNetGenerator.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions batchnorm.md
@@ -37,6 +37,7 @@ So in all subsequent experiments, BN is placed after the non-linearity
| PReLU |**0.503**| **2.19** | |
| ELU |0.498| 2.23 | |
| Maxout |0.487| 2.28| |
| Sigmoid |0.475| 2.35| |

### BN and dropout

File renamed without changes.
