# Activation Functions Overview

Understanding activation functions is crucial for understanding the behavior of artificial neural networks. Since there are many variants, I've compiled this concise overview for clarity.

Complete Python implementations of these functions, along with plots of each one, are available in the notebook or the Python file. Note that the Parametric ReLU is the same as the Leaky ReLU, except that the leakage coefficient is learned as a neural network parameter rather than fixed in advance.
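
As a rough illustration of that distinction, here is a minimal NumPy sketch (the parameter name `alpha`, the example inputs, and the gradient-descent step are illustrative assumptions, not code from the notebook): Leaky ReLU keeps `alpha` fixed, whereas a Parametric ReLU treats the same coefficient as a trainable parameter and updates it from the gradient of the loss.

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU with a fixed, hand-picked leakage coefficient alpha."""
    return np.where(x >= 0, x, alpha * x)

def grad_wrt_alpha(x):
    """Gradient of the output w.r.t. alpha: x on the negative side, 0 elsewhere."""
    return np.where(x >= 0, 0.0, x)

# Parametric ReLU: same functional form, but alpha is learned.
# One illustrative gradient-descent step on alpha:
alpha = 0.01                         # initial value of the learnable coefficient
x = np.array([-2.0, -0.5, 1.0])      # example pre-activations
upstream = np.ones_like(x)           # stand-in for the gradient flowing back from the loss
alpha -= 0.1 * np.sum(upstream * grad_wrt_alpha(x))   # learning rate 0.1
```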

| Function | Equation | Derivative |
| --- | --- | --- |
| Binary Step | $f(x)=\begin{cases}0, & x<0 \\ 1, & x\geq 0\end{cases}$ | $f'(x)=0$, undefined at $x=0$ |
| Piecewise Linear | $f(x)=\begin{cases}0, & x<x_{\min} \\ mx+b, & x_{\min}\leq x\leq x_{\max} \\ 1, & x>x_{\max}\end{cases}$ | $f'(x)=\begin{cases}0, & x<x_{\min} \\ m, & x_{\min}\leq x\leq x_{\max} \\ 0, & x>x_{\max}\end{cases}$ |
| Bipolar | $f(x)=\begin{cases}-1, & x<0 \\ 1, & x\geq 0\end{cases}$ | $f'(x)=0$, undefined at $x=0$ |
| Sigmoid | $f(x)=\frac{1}{1+e^{-x}}$ | $f'(x)=f(x)\,(1-f(x))$ |
| Bipolar Sigmoid | $f(x)=\frac{1-e^{-x}}{1+e^{-x}}$ | $f'(x)=\frac{1-f(x)^2}{2}$ |
| Hyperbolic Tangent, TanH | $f(x)=\frac{e^{x}-e^{-x}}{e^{x}+e^{-x}}$ | $f'(x)=1-f(x)^2$ |
| Arctangent, ArcTan | $f(x)=\tan^{-1}(x)$ | $f'(x)=\frac{1}{1+x^{2}}$ |
| Rectified Linear Units, ReLU | $f(x)=\begin{cases}0, & x<0 \\ x, & x\geq 0\end{cases}$ | $f'(x)=\begin{cases}0, & x<0 \\ 1, & x\geq 0\end{cases}$ |
| Leaky Rectified Linear Units, Leaky ReLU | $f(x)=\begin{cases}\alpha x, & x<0 \\ x, & x\geq 0\end{cases}$ | $f'(x)=\begin{cases}\alpha, & x<0 \\ 1, & x\geq 0\end{cases}$ |
| Exponential Linear Units, ELU | $f(x)=\begin{cases}\alpha\,(e^{x}-1), & x<0 \\ x, & x\geq 0\end{cases}$ | $f'(x)=\begin{cases}f(x)+\alpha, & x<0 \\ 1, & x\geq 0\end{cases}$ |
| SoftPlus | $f(x)=\ln(1+e^{x})$ | $f'(x)=\frac{1}{1+e^{-x}}$ |
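
For reference, here is a minimal, numerically naive NumPy sketch of a few of the functions and derivatives listed above; the notebook and Python file in this repository remain the complete implementations, and the function names below are my own.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(x):
    s = sigmoid(x)
    return s * (1.0 - s)          # f'(x) = f(x) * (1 - f(x))

def tanh_derivative(x):
    return 1.0 - np.tanh(x) ** 2  # f'(x) = 1 - f(x)^2

def relu(x):
    return np.maximum(0.0, x)

def relu_derivative(x):
    return np.where(x >= 0, 1.0, 0.0)

def elu(x, alpha=1.0):
    return np.where(x >= 0, x, alpha * (np.exp(x) - 1.0))

def elu_derivative(x, alpha=1.0):
    return np.where(x >= 0, 1.0, alpha * np.exp(x))  # equals f(x) + alpha for x < 0

def softplus(x):
    return np.log1p(np.exp(x))    # not stabilized for large x

def softplus_derivative(x):
    return sigmoid(x)
```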