This repo implements an auto-encoder built on the Residual Network (ResNet) architecture using the TensorFlow library. The residual learning strategy in ResNet demonstrably benefits the training of very deep neural networks, mitigating saturation and the degradation (performance-decreasing) problem.
Several tricks are applied to train the network more effectively, e.g., to avoid over-fitting and to accelerate convergence to a local optimum:
- ReLU activation function
- Batch normalization
- Exponentially decaying learning rate
- Dropout
- Regularization
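As a concrete illustration of one of these tricks, the exponentially decaying learning rate can be sketched in a few lines. This is a minimal stand-alone version of the schedule (the same formula TensorFlow exposes as `tf.train.exponential_decay` / `ExponentialDecay`); the parameter values below are hypothetical, not taken from this repo's configuration.

```python
def exponential_decay(base_lr, decay_rate, decay_steps, global_step):
    """Learning rate decayed exponentially with the training step:
    lr = base_lr * decay_rate ** (global_step / decay_steps)."""
    return base_lr * decay_rate ** (global_step / decay_steps)

# Example: start at 0.1 and multiply by 0.96 every 1000 steps.
lr0 = exponential_decay(0.1, 0.96, 1000, 0)      # 0.1 at step 0
lr1 = exponential_decay(0.1, 0.96, 1000, 1000)   # 0.096 after one decay period
```

A smoothly decaying rate lets training take large steps early and fine-grained steps near a local optimum.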
First, the bottleneck structure, the basic unit of the ResNet, is implemented. Then the block class, composed of multiple bottlenecks, is built. With these, the ResNet-based encoder can be constructed. The decoder, which is conventionally symmetric to the encoder, is formed by reversing the encoder. A diagram is illustrated as follows; the design is similar to the famous skip connection.
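The bottleneck-with-skip-connection idea described above can be sketched without TensorFlow at all. The following NumPy snippet uses dense matrix multiplies as stand-ins for the 1x1 / 3x3 / 1x1 convolutions of a real bottleneck; the function and weight names are illustrative, not the ones used in this repo.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def bottleneck(x, w_reduce, w_mid, w_expand):
    """Sketch of a bottleneck residual unit: reduce, transform,
    expand, then add the identity skip path back in."""
    h = relu(x @ w_reduce)   # reduce width (1x1 conv analogue)
    h = relu(h @ w_mid)      # core transform (3x3 conv analogue)
    h = h @ w_expand         # expand back to input width (1x1 conv analogue)
    return relu(h + x)       # identity skip connection

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 64))
y = bottleneck(x,
               rng.standard_normal((64, 16)) * 0.1,
               rng.standard_normal((16, 16)) * 0.1,
               rng.standard_normal((16, 64)) * 0.1)
```

Because the output keeps the input's shape, such units can be stacked into blocks, which is exactly what makes the encoder (and its reversed decoder) easy to compose.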
Some Python packages should be installed before applying the nets; they are listed as follows.
Also, CUDA is required if you want to run the code on a GPU; a guide in Chinese is provided here.
- Zhixian MA <zx at mazhixian.me>
Unless otherwise declared:
- Code developed is distributed under the MIT license;
- Documentation and generated products are distributed under the Creative Commons Attribution 3.0 license;
- Third-party code and products used are distributed under their own licenses.