kanchanchy/Multilayer-Neural-Network-from-Scratch
Multi-layer Neural Network from Scratch using NumPy

This repository shows a sketch implementation of a multi-layer neural network using NumPy. The utils.py file implements the building blocks of a deep neural network, and multi_layer_nn_sketch.py tests the functions implemented in utils.py and uses them to train and evaluate a multi-layer neural network.

Run the Code

  1. Run datasets.py to download the MNIST dataset.
  2. Run multi_layer_nn_sketch.py to test the implemented functions and the multi-layer neural network. You can test one function at a time by commenting out the other calls in multi_layer_nn_sketch.py.

Functions Implemented in utils.py

  1. Parameter initialization
  2. ReLU activation
  3. Gradient of ReLU activation
  4. Linear activation
  5. Derivative of linear activation
  6. Softmax cross entropy loss
  7. One hot representation of classes
  8. Derivative of softmax cross entropy loss
  9. Forward dropout
  10. Backward dropout
  11. Single layer forward propagation
  12. Multi-layer forward propagation
  13. Single layer backward propagation
  14. Multi-layer backward propagation
  15. Classification/Prediction
  16. Calculating momentum
  17. Updating parameters with momentum
  18. Multi-layer neural network
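Item 1 in the list, parameter initialization, can be sketched in NumPy as below. The function name initialize_parameters, the layer_dims argument, and the use of He initialization are assumptions for illustration, not necessarily what utils.py does:

```python
import numpy as np

def initialize_parameters(layer_dims, rng):
    # layer_dims is e.g. [784, 128, 10]: input size, hidden size(s), output size.
    # He initialization (scale sqrt(2 / fan_in)) is a common choice for ReLU layers.
    params = {}
    for l in range(1, len(layer_dims)):
        fan_in = layer_dims[l - 1]
        params[f"W{l}"] = rng.standard_normal((fan_in, layer_dims[l])) * np.sqrt(2.0 / fan_in)
        params[f"b{l}"] = np.zeros((1, layer_dims[l]))  # biases start at zero
    return params
```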
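Items 2 and 3, the ReLU activation and its gradient, are standard element-wise operations and could look like this (hypothetical function names, assuming the same behavior as the common definition):

```python
import numpy as np

def relu(z):
    # Element-wise ReLU: max(0, z)
    return np.maximum(0, z)

def relu_gradient(z):
    # Derivative of ReLU: 1 where z > 0, else 0 (the value at z == 0 is a convention)
    return (z > 0).astype(z.dtype)

z = np.array([[-1.0, 0.0, 2.0]])
print(relu(z))           # [[0. 0. 2.]]
print(relu_gradient(z))  # [[0. 0. 1.]]
```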
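Items 6 through 8, the softmax cross-entropy loss, one-hot encoding, and the loss derivative, fit together naturally: combining softmax and cross-entropy makes the gradient with respect to the logits simply probs - labels. A minimal sketch (function names and the row-per-sample layout are assumptions):

```python
import numpy as np

def one_hot(labels, n_classes):
    # Convert integer class labels to one-hot rows
    out = np.zeros((labels.size, n_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

def softmax_cross_entropy(logits, y_one_hot):
    # Shift by the row max for numerical stability, then mean cross-entropy
    shifted = logits - logits.max(axis=1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=1, keepdims=True)
    loss = -np.mean(np.sum(y_one_hot * np.log(probs + 1e-12), axis=1))
    return loss, probs

def softmax_cross_entropy_grad(probs, y_one_hot):
    # Gradient of combined softmax + cross-entropy w.r.t. the logits
    return (probs - y_one_hot) / y_one_hot.shape[0]
```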
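Items 9 and 10, forward and backward dropout, are typically implemented as "inverted" dropout: units are zeroed with probability 1 - keep_prob and the survivors are scaled up by 1 / keep_prob, so no rescaling is needed at test time. The backward pass reuses the same mask. A sketch under those assumptions (names are hypothetical):

```python
import numpy as np

def dropout_forward(a, keep_prob, rng):
    # Inverted dropout: drop with prob (1 - keep_prob), scale survivors by 1/keep_prob
    mask = (rng.random(a.shape) < keep_prob) / keep_prob
    return a * mask, mask

def dropout_backward(da, mask):
    # Gradients flow only through the units that were kept
    return da * mask
```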
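Items 16 and 17, calculating momentum and updating parameters with it, can be sketched as an exponentially weighted average of past gradients. The function name, the dict-of-arrays layout, and the beta = 0.9 default are assumptions for illustration:

```python
import numpy as np

def update_with_momentum(params, grads, velocities, lr=0.01, beta=0.9):
    # v <- beta * v + (1 - beta) * grad;  param <- param - lr * v
    for key in params:
        velocities[key] = beta * velocities[key] + (1 - beta) * grads[key]
        params[key] = params[key] - lr * velocities[key]
    return params, velocities
```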
