Understanding Activation Functions

Ashwin Phadke edited this page Aug 19, 2020 · 3 revisions

Background

Activation functions can be a driving factor in the performance of your neural network. Sigmoid, tanh, or whichever function you choose, each has its own set of advantages for particular applications. Here we explore activation functions in brief and see how they work when you declare them in a layer, for example: Conv2D(64, (3,3), activation='relu', input_shape=(28, 28, 1)).
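Under the hood, an activation such as 'relu' is just an element-wise function applied to the layer's outputs. A minimal NumPy sketch of the three functions mentioned above (the function names here are illustrative, not the Keras internals):

```python
import numpy as np

def relu(x):
    # max(0, x), applied element-wise: negatives become 0
    return np.maximum(0.0, x)

def sigmoid(x):
    # squashes every input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # squashes every input into the range (-1, 1)
    return np.tanh(x)

x = np.array([-2.0, 0.0, 3.0])
print(relu(x))        # [0. 0. 3.]
print(sigmoid(0.0))   # 0.5
print(tanh(0.0))      # 0.0
```

When you write activation='relu' in a Keras layer, the layer applies exactly this kind of element-wise transform to its convolution output before passing it on.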

Activation function

In artificial neural networks, the activation function of a node defines the output of that node given an input or set of inputs.
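Concretely, a node first computes a weighted sum of its inputs plus a bias, and the activation function then maps that sum to the node's output. A small illustrative sketch (the helper name node_output is an assumption for this example, not a standard API):

```python
import numpy as np

def node_output(x, w, b, activation):
    # z is the weighted sum of inputs plus bias; the activation
    # function f maps z to the node's final output a = f(z)
    z = np.dot(w, x) + b
    return activation(z)

relu = lambda z: max(0.0, z)

x = np.array([1.0, 2.0])   # inputs to the node
w = np.array([0.5, -1.0])  # weights
b = 0.25                   # bias
# z = 0.5*1.0 + (-1.0)*2.0 + 0.25 = -1.25, so relu gives 0.0
print(node_output(x, w, b, relu))  # 0.0
```

Swapping relu for sigmoid or tanh changes only how z is mapped to the output, which is why the choice of activation can be made per layer.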