My neural network implementation using only NumPy, featuring dense layers with ReLU activation, Softmax with categorical cross-entropy loss, forward propagation, and backpropagation.
This is a toy project, so many proper training and inference utilities are certainly missing, but it correctly handles both forward propagation and backpropagation.
Tested on the MNIST dataset, it achieved over 90% test accuracy with two dense layers and a softmax output layer.
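The forward and backward passes described above can be sketched roughly as follows. This is a minimal illustration of the same building blocks (dense layers, ReLU, softmax with categorical cross-entropy), not the project's actual code; all function and variable names here are assumed for the example:

```python
import numpy as np

# Minimal sketch (not the project's code): two dense layers with ReLU,
# softmax output, categorical cross-entropy, and one SGD step.
rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(0, z)

def softmax(z):
    e = np.exp(z - z.max(axis=1, keepdims=True))  # numerically stabilized
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # y holds integer class labels
    n = y.shape[0]
    return -np.log(probs[np.arange(n), y] + 1e-12).mean()

# Toy batch: 4 samples, 5 features, 3 classes
x = rng.normal(size=(4, 5))
y = np.array([0, 2, 1, 2])
W1, b1 = rng.normal(size=(5, 8)) * 0.1, np.zeros(8)
W2, b2 = rng.normal(size=(8, 3)) * 0.1, np.zeros(3)

# Forward propagation
z1 = x @ W1 + b1
a1 = relu(z1)
z2 = a1 @ W2 + b2
probs = softmax(z2)
loss = cross_entropy(probs, y)

# Backpropagation: softmax + cross-entropy gradient is (probs - one_hot) / n
n = y.shape[0]
dz2 = probs.copy()
dz2[np.arange(n), y] -= 1
dz2 /= n
dW2 = a1.T @ dz2
db2 = dz2.sum(axis=0)
da1 = dz2 @ W2.T
dz1 = da1 * (z1 > 0)  # ReLU passes gradient only where the input was positive
dW1 = x.T @ dz1
db1 = dz1.sum(axis=0)

# One SGD step
lr = 0.1
W1 -= lr * dW1; b1 -= lr * db1
W2 -= lr * dW2; b2 -= lr * db2
```

A single gradient step like the one above should lower the cross-entropy loss on the same batch, which is a quick sanity check for a hand-rolled backward pass.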
To test the project, open a terminal and run "python run.py".
This project uses the MNIST dataset, which is publicly available from Yann LeCun's website.