Understanding Activation functions
Ashwin Phadke edited this page Aug 19, 2020 · 3 revisions
Activation functions can be driving elements of the performance of your neural network. `sigmoid`, `tanh`, or whatever function you choose, each has its own set of advantages for particular applications. Here we explore activation functions in brief and see how they work when you declare them in a layer, for example: `Conv2D(64, (3,3), activation='relu', input_shape=(28, 28, 1))`.
In artificial neural networks, the activation function of a node defines the output of that node given an input or set of inputs.
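To make that definition concrete, here is a minimal NumPy sketch (the function names and the sample input are illustrative, not from this page) of the three activations mentioned above, each mapping a node's input to its output:

```python
import numpy as np

def sigmoid(x):
    # Squashes any real input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes any real input into the range (-1, 1), centered at 0.
    return np.tanh(x)

def relu(x):
    # Passes positive inputs through unchanged; zeros out negatives.
    return np.maximum(0.0, x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))
print(tanh(x))
print(relu(x))  # [0. 0. 2.]
```

Note how each function is applied element-wise: the same node-level rule is broadcast over every unit in a layer, which is exactly what Keras does internally when you pass `activation='relu'` to a layer.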