This repository contains a notebook with implementations of the most common activation functions:
- Binary
- Linear
- Sigmoid
- Tanh
- ReLU
- Leaky ReLU (LReLU)
- Parametric ReLU (PReLU)
- Exponential Linear Unit (ELU)
- ReLU-6
- Softplus
- Softsign
- Softmax
- Gaussian
- Swish
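A few of the functions above can be sketched in plain Python as follows. This is a minimal illustrative sketch, not the notebook's code; function names and the default `alpha`/`beta` parameters are assumptions:

```python
import math

def sigmoid(x):
    # Squashes any real input into (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def relu(x):
    # Zero for negative inputs, identity for positive inputs
    return max(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but lets a small gradient through for x < 0
    return x if x > 0 else alpha * x

def softmax(xs):
    # Converts a list of scores into probabilities summing to 1;
    # subtracting the max improves numerical stability
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def swish(x, beta=1.0):
    # Smooth, non-monotonic: x scaled by sigmoid(beta * x)
    return x * sigmoid(beta * x)
```

For example, `sigmoid(0.0)` returns `0.5`, and `softmax` over any list of scores yields values that sum to 1.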
This notebook accompanies the article *What is an activation function?* on medium.com.