This repository contains a sketch implementation of a multi-layer neural network using NumPy. The utils.py file implements the building-block functions of a deep neural network, and multi_layer_nn_sketch.py tests those functions and then trains and evaluates a multi-layer neural network built from them.
- Run datasets.py to download the MNIST dataset.
- Run multi_layer_nn_sketch.py to test the implemented functions and the multi-layer neural network. You can test one function at a time by commenting out the other test calls in multi_layer_nn_sketch.py.

The following are implemented in utils.py:
- Parameter initialization
- ReLU activation
- Gradient of the ReLU activation
- Linear activation
- Derivative of linear activation
- Softmax cross-entropy loss
- One-hot representation of class labels
- Derivative of the softmax cross-entropy loss
- Forward dropout
- Backward dropout
- Single layer forward propagation
- Multi-layer forward propagation
- Single layer backward propagation
- Multi-layer backward propagation
- Classification/Prediction
- Calculating momentum
- Updating parameters with momentum
- Multi-layer neural network
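As a rough illustration of the parameter-initialization step above, one layer's weights and biases might be created as follows. The Gaussian scale of 0.01, the zero biases, and the function name are assumptions for this sketch, not necessarily the repository's actual choices:

```python
import numpy as np

def init_layer(n_in, n_out, seed=0):
    # Hypothetical initializer: small Gaussian weights, zero biases.
    # The 0.01 scale is an assumed choice, not taken from utils.py.
    rng = np.random.default_rng(seed)
    W = rng.normal(0.0, 0.01, size=(n_in, n_out))
    b = np.zeros((1, n_out))
    return W, b
```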
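The ReLU activation and its gradient listed above can be sketched in NumPy as follows (function names are illustrative, not necessarily those used in utils.py):

```python
import numpy as np

def relu(x):
    # Element-wise ReLU: max(0, x)
    return np.maximum(0, x)

def relu_grad(x):
    # Derivative of ReLU: 1 where x > 0, else 0
    return (x > 0).astype(x.dtype)
```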
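The one-hot encoding, the softmax cross-entropy loss, and its derivative fit together as in the sketch below. Averaging the loss over the batch and folding the softmax into the gradient (so the gradient with respect to the logits is simply `softmax(z) - y` scaled by 1/N) are standard conventions assumed here:

```python
import numpy as np

def one_hot(labels, n_classes):
    # (N,) integer labels -> (N, n_classes) one-hot matrix
    return np.eye(n_classes)[labels]

def softmax(z):
    # Row-wise softmax with max subtraction for numerical stability
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def softmax_cross_entropy(z, y_one_hot):
    # Mean cross-entropy loss over the batch (epsilon guards log(0))
    p = softmax(z)
    return -np.mean(np.sum(y_one_hot * np.log(p + 1e-12), axis=1))

def softmax_cross_entropy_grad(z, y_one_hot):
    # Gradient of the mean loss w.r.t. the logits: (softmax(z) - y) / N
    return (softmax(z) - y_one_hot) / z.shape[0]
```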
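The forward and backward dropout steps can be sketched with the common "inverted dropout" scheme, where surviving activations are rescaled by 1/keep_prob at training time so no scaling is needed at test time. Whether utils.py uses inverted dropout is an assumption here:

```python
import numpy as np

def dropout_forward(a, keep_prob, rng):
    # Inverted dropout: zero each unit with probability (1 - keep_prob),
    # and scale the survivors by 1/keep_prob.
    mask = (rng.random(a.shape) < keep_prob) / keep_prob
    return a * mask, mask

def dropout_backward(da, mask):
    # Route gradients only through the units kept in the forward pass
    return da * mask
```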
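Single-layer forward propagation, the multi-layer forward pass, and classification/prediction compose as in the sketch below. The row-major convention (rows of the input are samples), the `(W, b)` parameter pairs, and keeping the output layer as raw logits (softmax being folded into the loss) are assumptions for illustration:

```python
import numpy as np

def relu(x):
    return np.maximum(0, x)

def layer_forward(a_prev, W, b, activation):
    # One layer: affine transform followed by the activation.
    # Assumed convention: rows of a_prev are samples.
    return activation(a_prev @ W + b)

def forward(x, params):
    # params: list of (W, b) pairs; ReLU on hidden layers,
    # raw logits at the output (softmax is folded into the loss).
    a = x
    for W, b in params[:-1]:
        a = layer_forward(a, W, b, relu)
    W, b = params[-1]
    return a @ W + b

def predict(x, params):
    # Predicted class = index of the highest logit per sample
    return np.argmax(forward(x, params), axis=1)
```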
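The momentum calculation and the momentum-based parameter update might look like the following. There are several common momentum formulations; the exponential-average form below (and the hyperparameter values) is an assumed choice, not necessarily the one in utils.py:

```python
import numpy as np

def momentum_update(param, grad, velocity, alpha=0.01, beta=0.9):
    # Momentum term: exponential moving average of past gradients
    v = beta * velocity + (1 - beta) * grad
    # Step the parameter along the averaged gradient
    return param - alpha * v, v
```

With `beta=0` this reduces to plain gradient descent, which makes the update easy to sanity-check.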