Compare SELUs (scaled exponential linear units) with other activations on MNIST, CIFAR10, etc.
Updated Nov 1, 2017 - Python
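For reference, SELU rescales the ELU nonlinearity with fixed constants chosen so that activations are pushed toward zero mean and unit variance across layers (the "self-normalizing" property). Below is a minimal NumPy sketch using the constants from Klambauer et al., "Self-Normalizing Neural Networks" (2017); it is an illustration of the standard definition, not code taken from the repository above.

```python
import numpy as np

# Fixed-point constants from Klambauer et al. (2017);
# these come from the paper, not from the repository above.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x):
    """SELU: scale * x for x > 0, scale * alpha * (exp(x) - 1) otherwise."""
    x = np.asarray(x, dtype=float)
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(x))

print(selu([-1.0, 0.0, 1.0]))  # approx. [-1.111  0.     1.051]
```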
Interesting Python code for tackling simple machine/deep learning tasks
"The 'Activation Functions' project repository contains implementations of various activation functions commonly used in neural networks. "