Artificial Neural Networks Activation Functions
The package provides the following activation functions:
- Rectified Linear Unit (ReLU)
- Logistic Sigmoid
- Hyperbolic Tangent
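
For reference, these are the formulas each activator computes. The sketch below is a minimal standalone C# implementation of the math (not the package's own code), using plain .NET Math calls:

using System;

static class Activations
{
    // ReLU: max(0, x); the derivative is 1 for x > 0 and 0 otherwise.
    public static double Relu(double x) => Math.Max(0.0, x);
    public static double ReluDerivative(double x) => x > 0.0 ? 1.0 : 0.0;

    // Logistic sigmoid: 1 / (1 + e^(-x)); the derivative is s(x) * (1 - s(x)).
    public static double Sigmoid(double x) => 1.0 / (1.0 + Math.Exp(-x));
    public static double SigmoidDerivative(double x)
    {
        var s = Sigmoid(x);
        return s * (1.0 - s);
    }

    // Hyperbolic tangent: tanh(x); the derivative is 1 - tanh(x)^2.
    public static double Tanh(double x) => Math.Tanh(x);
    public static double TanhDerivative(double x)
    {
        var t = Math.Tanh(x);
        return 1.0 - t * t;
    }
}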
Install the package from NuGet via the Package Manager Console:

PM> Install-Package ann.activators.koryakinp
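
The same package can also be added with the .NET CLI:

dotnet add package ann.activators.koryakinp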
Create an activator through the factory, then evaluate the function and its derivative at a point:

var activator = ActivatorFactory.Produce(ActivatorType.Relu);
var x = activator.CalculateValue(3.5);         // ReLU(3.5) = 3.5
var dx = activator.CalculateDeriviative(3.5);  // ReLU'(3.5) = 1.0
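
The other activators work the same way through the factory. As a sketch, assuming the ActivatorType enum exposes members for the sigmoid and tanh activators (the member name below is hypothetical; check the enum definition for the exact ones):

// ActivatorType.Sigmoid is an assumed member name, used here for illustration.
var sigmoid = ActivatorFactory.Produce(ActivatorType.Sigmoid);
var y = sigmoid.CalculateValue(0.0);           // sigmoid(0) = 0.5
var dy = sigmoid.CalculateDeriviative(0.0);    // sigmoid'(0) = 0.25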
Pavel Koryakin - koryakinp@koryakinp.com
This project is licensed under the MIT License - see the LICENSE.md file for details.