This program runs a neural network, built from scratch in plain Python and NumPy, on provided inputs. It was created to develop a better understanding of how neural networks work.
This project was based on the Neural Networks from Scratch tutorial by Sentdex.
The neural network uses NumPy whenever possible for more efficient matrix calculations.
So far, the following features of neural networks have been implemented from scratch:
- Dense layers
- ReLU and Softmax activation functions
- Categorical Cross-Entropy Loss
- Forward propagation
- Backpropagation and gradient calculation
- Optimizers with learning rate decay
- Stochastic Gradient Descent (SGD) optimizer with momentum
- Adaptive Gradient (Adagrad) optimizer
- Root Mean Square Propagation (RMSProp) optimizer
- Adaptive Moment Estimation (Adam) optimizer
- L1 and L2 regularization
- Dropout
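To illustrate the first two features above, here is a minimal sketch of a dense layer with a ReLU activation. The class and function names are illustrative, not the project's actual API; the key idea is that a single NumPy matrix multiply processes a whole batch at once.

```python
import numpy as np

# Hypothetical minimal dense layer: weights shaped (n_inputs, n_neurons),
# inputs passed as a batch of row vectors.
class DenseLayer:
    def __init__(self, n_inputs, n_neurons):
        rng = np.random.default_rng(0)
        # Small random weights, zero biases (a common initialization).
        self.weights = 0.01 * rng.standard_normal((n_inputs, n_neurons))
        self.biases = np.zeros((1, n_neurons))

    def forward(self, inputs):
        # One matrix multiply handles the entire batch.
        self.output = inputs @ self.weights + self.biases
        return self.output

def relu(x):
    # ReLU zeroes out negative activations.
    return np.maximum(0, x)

batch = np.array([[1.0, 2.0, 3.0],
                  [-1.0, 0.5, 2.0]])
layer = DenseLayer(3, 4)
activations = relu(layer.forward(batch))
print(activations.shape)  # (2, 4)
```

Each row of `activations` is one sample's output, so the layer scales to any batch size without Python loops.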
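The Softmax activation and categorical cross-entropy loss pair naturally at the output layer. A possible sketch (function names are assumptions, not the project's API):

```python
import numpy as np

def softmax(logits):
    # Subtract the per-row max before exponentiating for numerical stability.
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def categorical_cross_entropy(probs, targets):
    # targets are integer class indices; clip to avoid log(0).
    confidences = probs[np.arange(len(probs)), targets]
    return -np.log(np.clip(confidences, 1e-7, 1 - 1e-7)).mean()

logits = np.array([[2.0, 1.0, 0.1],
                   [0.5, 2.5, 0.3]])
probs = softmax(logits)
loss = categorical_cross_entropy(probs, np.array([0, 1]))
print(probs.sum(axis=1))  # each row sums to 1
```

Softmax turns raw scores into a probability distribution per sample, and the loss penalizes low confidence in the correct class.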
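The optimizer features can be combined in one update rule. Below is a sketch of SGD with momentum and learning-rate decay, assuming a per-parameter velocity array (the class and attribute names are illustrative):

```python
import numpy as np

class SGDMomentum:
    def __init__(self, learning_rate=1.0, decay=1e-3, momentum=0.9):
        self.lr = learning_rate
        self.decay = decay
        self.momentum = momentum
        self.iterations = 0

    def update(self, params, grads, velocity):
        # Learning rate shrinks as training progresses (decay).
        current_lr = self.lr / (1.0 + self.decay * self.iterations)
        # Momentum: blend the previous velocity with the new gradient step.
        velocity[:] = self.momentum * velocity - current_lr * grads
        params += velocity
        self.iterations += 1

w = np.array([1.0, -2.0])
v = np.zeros_like(w)
opt = SGDMomentum()
opt.update(w, np.array([0.1, -0.1]), v)
print(w)  # [ 0.9 -1.9]
```

On the first step the velocity is just the negative scaled gradient; on later steps the momentum term carries a fraction of the previous update forward, smoothing the trajectory.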
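For dropout, a common choice is the "inverted" variant, which scales the surviving activations during training so no rescaling is needed at inference time. This is a sketch under that assumption, not necessarily how this project implements it:

```python
import numpy as np

def dropout_forward(inputs, rate, rng):
    # Keep each activation with probability (1 - rate); scale survivors
    # by 1/(1 - rate) so the expected activation is unchanged.
    mask = rng.binomial(1, 1 - rate, size=inputs.shape) / (1 - rate)
    return inputs * mask, mask

rng = np.random.default_rng(42)
x = np.ones((2, 5))
out, mask = dropout_forward(x, 0.5, rng)
print(out)  # entries are either 0.0 (dropped) or 2.0 (kept and scaled)
```

The returned mask is reused in backpropagation, since dropped units pass no gradient.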