This project was completed as part of the Honors portion of the Advanced Learning Algorithms course on Coursera.
Credit to DeepLearning.AI, Stanford, and the Coursera platform for providing the course materials and guidance.
In this project, I build upon the previous week's assignment, extending it to recognize the digits 0 through 9. The primary focus is on two widely used activation functions, ReLU (Rectified Linear Unit) and Softmax, in the context of multiclass classification. A neural network capable of multiclass classification is implemented in TensorFlow, using ReLU in the hidden layers and Softmax at the output. By the end of this report, we will have a clear understanding of how these activation functions contribute to the network's ability to recognize handwritten digits.
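As a minimal sketch of the two activations discussed, here they are in plain NumPy (the function and variable names are my own, not taken from the course materials). ReLU zeros out negative inputs, while Softmax converts a vector of logits into a probability distribution over the 10 digit classes:

```python
import numpy as np

def relu(z):
    # ReLU: pass positive values through, zero out negatives
    return np.maximum(0, z)

def softmax(z):
    # Subtract the max logit before exponentiating for numerical stability;
    # the result is unchanged because the shift cancels in the ratio
    ez = np.exp(z - np.max(z))
    return ez / ez.sum()

logits = np.array([-1.0, 0.0, 2.0])
probs = softmax(logits)  # non-negative values summing to 1
```

Note that Softmax acts on the whole output vector at once, unlike ReLU, which is applied elementwise; this is why Softmax is the natural choice for the final layer of a multiclass classifier.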