The activation function decides whether a neuron should be activated or not by calculating the weighted sum of its inputs and adding a bias to it.
The purpose of the activation function is to introduce non-linearity into the output of a neuron.
A neural network has neurons that work together through weights, biases, and their respective activation functions. The network updates the weights and biases of its neurons on the basis of the error at the output.
This process is known as back-propagation. Activation functions make back-propagation possible, since their gradients are supplied along with the error to update the weights and biases.
The activation function applies a non-linear transformation to the input, making the network capable of learning and performing more complex tasks.
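As a sketch of the idea so far, a single neuron's forward pass (weighted sum of inputs plus bias, followed by a non-linear activation) can be written in a few lines; the specific weights, bias, and choice of sigmoid here are illustrative assumptions, not values from the text:

```python
import math

def neuron_forward(inputs, weights, bias):
    """Weighted sum of inputs plus bias, passed through a sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias  # weighted sum + bias
    return 1.0 / (1.0 + math.exp(-z))  # non-linear activation (sigmoid)

# Example: two inputs with illustrative weights and a small bias
out = neuron_forward([0.5, -1.0], [0.8, 0.2], bias=0.1)
print(round(out, 4))
```

Without the final non-linear step, stacking such neurons would collapse into a single linear map, which is exactly why the activation function matters.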
- Linear Function
Equation : The linear function has an equation similar to that of a straight line, i.e.
y = x
Value Range : -inf to +inf
- Sigmoid Function
It is a function that is plotted as an ‘S’-shaped graph.
Equation : A = 1/(1 + e^(-x))
Value Range : 0 to 1
Uses : Usually used in the output layer of a binary classification, where the result is either 0 or 1.
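A minimal sigmoid implementation, using only the Python standard library, shows the squashing behaviour described above:

```python
import math

def sigmoid(x):
    """A = 1 / (1 + e^(-x)); squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-x))

print(sigmoid(0))   # → 0.5, the midpoint of the S-curve
print(sigmoid(5))   # large positive inputs approach 1
print(sigmoid(-5))  # large negative inputs approach 0
```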
- Tanh Function
Equation : A = tanh(x) = 2/(1 + e^(-2x)) - 1
Value Range : -1 to +1
Nature : non-linear
Uses : Usually used in the hidden layers of a neural network, as its values lie between -1 and 1.
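Python's standard library already provides tanh, and the identity tanh(x) = 2·sigmoid(2x) − 1 ties it back to the sigmoid; a small sketch for illustration:

```python
import math

def tanh_activation(x):
    """tanh squashes input into (-1, 1) and is zero-centred, unlike the sigmoid."""
    return math.tanh(x)

def tanh_via_sigmoid(x):
    # The same curve expressed through a sigmoid: tanh(x) = 2*sigmoid(2x) - 1
    return 2.0 / (1.0 + math.exp(-2.0 * x)) - 1.0

print(tanh_activation(0.0))  # → 0.0, the centre of the range
print(abs(tanh_activation(1.0) - tanh_via_sigmoid(1.0)) < 1e-12)  # → True
```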
- ReLU Function
It stands for Rectified Linear Unit. It is the most widely used activation function, chiefly implemented in the hidden layers of a neural network.
Equation : A(x) = max(0, x). It gives an output of x if x is positive and 0 otherwise.
Value Range : [0, inf)
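The ReLU equation translates directly into one line of Python:

```python
def relu(x):
    """A(x) = max(0, x): passes positive inputs through, zeroes out the rest."""
    return max(0.0, x)

print([relu(x) for x in [-2.0, -0.5, 0.0, 1.5, 3.0]])  # → [0.0, 0.0, 0.0, 1.5, 3.0]
```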
- Softmax Function
The softmax function is also a type of sigmoid function, but it is handy when we are trying to handle multi-class classification problems.
Equation : softmax(x_i) = e^(x_i) / Σ_j e^(x_j)
Nature : non-linear
Uses : Usually used in the output layer of a multi-class classifier, where the outputs form a probability distribution over the classes.
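A sketch of softmax over a list of scores; subtracting the maximum score before exponentiating is a common numerical-stability trick and does not change the result:

```python
import math

def softmax(scores):
    """Exponentiate each score and normalise so the outputs sum to 1."""
    m = max(scores)                            # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print([round(p, 3) for p in probs])  # probabilities over 3 classes, summing to 1
```

Because the outputs are non-negative and sum to 1, they can be read directly as class probabilities, which is why softmax sits in the output layer of multi-class classifiers.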