- [x] ReLU
- [x] Tanh
- [x] Affine
- [x] Sigmoid
- [x] Leaky ReLU
- [x] ELU
- [x] Linear
- [x] Softmax
- [x] Hard Sigmoid
- [x] Exponential
- [ ] PReLU
- [ ] ThresholdedReLU
- [x] SELU
- [ ] Softplus
- [ ] Softsign

I will try to implement the rest of the activation functions; a rough sketch of the remaining ones follows below. Any suggestions for other activation functions are welcome!
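As a possible starting point for the unchecked items, here is a minimal NumPy sketch of PReLU, ThresholdedReLU, Softplus, and Softsign. The function names and the `alpha`/`theta` parameters are assumptions, not this project's API, and would need to be adapted to its activation interface (e.g. PReLU's `alpha` is normally a learnable per-channel parameter, not a fixed constant).

```python
import numpy as np

def prelu(x, alpha=0.25):
    """PReLU: like Leaky ReLU, but alpha is meant to be learned during
    training. (alpha=0.25 is only a placeholder default here.)"""
    return np.where(x >= 0, x, alpha * x)

def thresholded_relu(x, theta=1.0):
    """ThresholdedReLU: passes x through only where x > theta, else 0."""
    return np.where(x > theta, x, 0.0)

def softplus(x):
    """Softplus: log(1 + exp(x)), a smooth approximation of ReLU.
    Written as max(x, 0) + log1p(exp(-|x|)) to avoid overflow for large x."""
    return np.maximum(x, 0) + np.log1p(np.exp(-np.abs(x)))

def softsign(x):
    """Softsign: x / (1 + |x|), a cheaper tanh-like curve that saturates
    polynomially rather than exponentially."""
    return x / (1.0 + np.abs(x))
```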