Keras Customizable Residual Unit

This building block shows you how to easily incorporate custom residual connections into your Keras neural networks. I spent longer than I'd have liked trying to add these kinds of blocks to my models, so I thought I'd share the result.

Run `python residual.py` to build the model and generate a PNG diagram of it to peruse.

This is a simplified implementation of the basic (no bottlenecks) full pre-activation residual unit from He, K., Zhang, X., Ren, S., Sun, J., Identity Mappings in Deep Residual Networks.
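
Below is a minimal sketch of what such a unit looks like, assuming the `tf.keras` functional API (the original `residual.py` dates from 2016 and may use the older Keras API). The function name `residual_unit` and its parameters are illustrative and not the API exposed by this repo.

```python
# Sketch of a full pre-activation residual unit (He et al.):
# BN -> ReLU -> Conv, repeated twice, then added to an identity shortcut.
# No bottleneck and no downsampling, matching the simplified unit described above.
from tensorflow.keras import layers, Input, Model

def residual_unit(x, filters, kernel_size=3):
    """BN-ReLU-Conv-BN-ReLU-Conv with an identity shortcut."""
    shortcut = x
    y = layers.BatchNormalization()(x)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, kernel_size, padding="same")(y)
    y = layers.BatchNormalization()(y)
    y = layers.Activation("relu")(y)
    y = layers.Conv2D(filters, kernel_size, padding="same")(y)
    # Identity mapping: nothing is applied on the skip path, so `filters`
    # must match the channel count of the incoming tensor.
    return layers.add([shortcut, y])

# Example: stack a couple of units on a toy input.
inputs = Input(shape=(32, 32, 16))
x = residual_unit(inputs, filters=16)
x = residual_unit(x, filters=16)
model = Model(inputs, x)
model.summary()
```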

See the reference implementation at keras-resnet for a full model that includes bottlenecks and downsampling between units.

Further credit to Keunwoo Choi, Nicholas Dronen, and Alejandro Newell for creating the residual blocks this is based on.
