- CIFAR-10 is a dataset of 60,000 32x32 color images in 10 classes, i.e., 6,000 images per class.
- We need to design a model that correctly predicts the class label of each image.
- Data Source: https://github.com/YoongiKim/CIFAR-10-images
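To make the data shapes concrete, here is a minimal sketch (assuming NumPy; random arrays stand in for the real images, so nothing needs to be downloaded):

```python
import numpy as np

# Stand-in batch with CIFAR-10's documented dimensions (random data, not real images):
# each image is 32x32 pixels with 3 color channels, and labels range over 10 classes.
rng = np.random.default_rng(0)
batch_size = 8
images = rng.integers(0, 256, size=(batch_size, 32, 32, 3), dtype=np.uint8)
labels = rng.integers(0, 10, size=(batch_size,))

print(images.shape)  # (8, 32, 32, 3)
print(labels.shape)  # (8,)
```

A real pipeline would read the PNG files from the repository above into arrays of exactly these shapes before feeding them to the model.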
- Activation functions are mathematical functions applied to the output of a neural network layer to introduce nonlinearity into the network. Nonlinearity is important because it allows neural networks to learn complex relationships between input and output data; without it, stacked linear layers collapse into a single linear transformation.
Here are some commonly used activation functions:
- Step Function
- Sigmoid Function
- Tanh Function
- ReLU Function
- ELU Function
- SELU Function
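The functions listed above can be sketched in a few lines of NumPy. This is an illustrative implementation, not a framework API; the ELU and SELU constants shown (`alpha`, `scale`) are the commonly published defaults:

```python
import numpy as np

def step(x):
    # Heaviside step: 1 where x >= 0, else 0 (not differentiable, so rarely used for training)
    return np.where(x >= 0, 1.0, 0.0)

def sigmoid(x):
    # Squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes inputs into (-1, 1), zero-centered
    return np.tanh(x)

def relu(x):
    # Passes positive values through, zeros out negatives
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Like ReLU for x > 0, but a smooth exponential curve for x <= 0
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x, alpha=1.6732632423543772, scale=1.0507009873554805):
    # Scaled ELU; these constants come from the self-normalizing networks literature
    return scale * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))
```

For example, `sigmoid(0.0)` returns `0.5` and `relu(-3.0)` returns `0.0`; each function accepts scalars or whole NumPy arrays.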