Activation functions for neural networks.
These activation functions are included:
- Swish (`x / (1 + exp(-x))`)
- Sigmoid (`1 / (1 + exp(-x))`)
- SoftPlus (`log(1 + exp(x))`)
- Gaussian01 (`exp(-(x * x) / 2.0)`)
- Sin (`math.Sin(math.Pi * x)`)
- Cos (`math.Cos(math.Pi * x)`)
- Linear (`x`)
- Inv (`-x`)
- ReLU (`x >= 0 ? x : 0`)
- Squared (`x * x`)
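As a rough illustration, the formulas above translate directly into plain Go functions. The sketch below implements Swish, Sigmoid and ReLU from those formulas only; it is not the package's own API, just the math:

```go
package main

import (
	"fmt"
	"math"
)

// Sigmoid returns 1 / (1 + exp(-x)).
func Sigmoid(x float64) float64 {
	return 1.0 / (1.0 + math.Exp(-x))
}

// Swish returns x / (1 + exp(-x)), i.e. x * Sigmoid(x).
func Swish(x float64) float64 {
	return x / (1.0 + math.Exp(-x))
}

// ReLU returns x for non-negative x, and 0 otherwise.
func ReLU(x float64) float64 {
	if x >= 0 {
		return x
	}
	return 0
}

func main() {
	for _, x := range []float64{-2, -1, 0, 1, 2} {
		fmt.Printf("x=%5.2f  sigmoid=%.4f  swish=%.4f  relu=%.2f\n",
			x, Sigmoid(x), Swish(x), ReLU(x))
	}
}
```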
These math functions are included just for convenience:
- Abs (`math.Abs`)
- Tanh (`math.Tanh`)
One function that takes two arguments is also included:
- PReLU (`x >= 0 ? x : x * a`)
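Since PReLU takes both the input `x` and a slope parameter `a`, a minimal sketch of the formula in Go could look like the following (again based only on the formula above, not necessarily the exact signature used by this package):

```go
package main

import "fmt"

// PReLU returns x for non-negative x; for negative x it returns x * a,
// where a is the (usually small) slope of the negative part.
func PReLU(x, a float64) float64 {
	if x >= 0 {
		return x
	}
	return x * a
}

func main() {
	fmt.Println(PReLU(2.0, 0.1))  // 2
	fmt.Println(PReLU(-2.0, 0.1)) // -0.2
}
```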
- Requires Go 1.11 or later.
- License: MIT
- Version: 0.3.2
- Author: Alexander F. Rødseth <xyproto@archlinux.org>