Neural Networks From Scratch

🌟 Implementation of Neural Networks from Scratch Using Python & Numpy 🌟

Uses Python 3.7.4

This repository includes detailed math equations and graphs for every implemented feature, serving as a basis for a deeper, more thorough understanding of Neural Networks. A basic understanding of linear algebra, matrix operations, and calculus is assumed.

Contents 📑

Setup 💻

git clone <url>
pip install -r requirements.txt

Here, Keras is used only to load the MNIST dataset.

Usage 📔

  • Tune hyperparameters in config.py
  • Run the following command
python main.py
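The exact fields in config.py depend on the repository, but a hyperparameter file for a run like the one below might look something like this (field names are assumptions, not necessarily the repository's actual ones; check config.py for the real options):

```python
# Hypothetical config.py sketch -- names are illustrative assumptions.
EPOCHS = 30          # matches the sample run below (epoch x/30)
LEARNING_RATE = 0.1  # assumed typical value, not taken from the repo
BATCH_SIZE = 32      # assumed typical value, not taken from the repo
```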

Output:

$ python main.py
epoch 1/30      error=0.173172
epoch 2/30      error=0.077458
epoch 3/30      error=0.058955
epoch 4/30      error=0.048161
.....
.....
.....
epoch 26/30     error=0.010333
epoch 27/30     error=0.009944
epoch 28/30     error=0.009602
epoch 29/30     error=0.009298
epoch 30/30     error=0.009045

Predicted Values: 
[array([[-4.31825197e-04, -1.80361575e-03,  6.84263430e-03,
        -1.42045839e-02, -1.32599433e-02, -3.67077777e-02,
         3.73258781e-02,  0.97446495,  4.59079629e-02,
        -8.94465105e-03]]), 
array([[ 0.0461294 , -0.00845601,  0.8578162 , -0.00272202,  0.01397735,
         0.17131938,  0.21350745, -0.06529926,  0.01975232, -0.10840968]])]
True Values: 
[[0. 0. 0. 0. 0. 0. 0. 1. 0. 0.]
 [0. 0. 1. 0. 0. 0. 0. 0. 0. 0.]]
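Each prediction is a vector of 10 scores, one per digit class, and the index of the largest score is the predicted digit. A small sketch using NumPy, with the scores copied (rounded) from the first sample above:

```python
import numpy as np

# Scores rounded from the first predicted vector in the sample output.
pred = np.array([-4.318e-04, -1.804e-03, 6.843e-03, -1.420e-02, -1.326e-02,
                 -3.671e-02, 3.733e-02, 9.745e-01, 4.591e-02, -8.945e-03])
# Corresponding one-hot true label.
true = np.array([0., 0., 0., 0., 0., 0., 0., 1., 0., 0.])

# argmax decodes both the prediction and the one-hot label to a digit.
print(np.argmax(pred), np.argmax(true))  # both 7 for this sample
```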

Roadmap 📑

  • Activation Functions
    • Linear
    • Sigmoid
    • Tanh
    • ReLU
    • LeakyReLU
    • Softmax
    • GELU
  • Loss Functions
    • MAE
    • MSE
    • CrossEntropy
  • Optimizers
    • Gradient Descent
    • Gradient Descent w/ Momentum
    • Nesterov's Accelerated Gradient
    • RMSProp
    • Adam
  • Regularization
    • L1
    • L2
    • Dropout
  • Layer Architecture
  • Wrapper Classes
  • Hyperparameters Configuration
  • Clean Architecture
  • UI (Similar to Tensorflow Playground)
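To give a flavor of what "from scratch" means here, the following is a minimal NumPy sketch of the core ingredients above (a dense layer, a tanh activation, MSE loss, and plain gradient descent) trained on XOR. It is written for this README as an illustration, not taken from the repository's code; layer sizes and the learning rate are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden tanh layer, linear output, MSE loss.
W1 = rng.standard_normal((2, 8)); b1 = np.zeros(8)
W2 = rng.standard_normal((8, 1)); b2 = np.zeros(1)
lr = 0.1  # arbitrary learning rate for this toy example

def forward(X):
    h = np.tanh(X @ W1 + b1)   # hidden activations
    return h, h @ W2 + b2      # linear output scores

_, out = forward(X)
initial_loss = np.mean((out - Y) ** 2)

for _ in range(500):
    h, out = forward(X)
    # Backpropagate the MSE loss through both layers.
    grad_out = 2 * (out - Y) / len(X)
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = grad_out @ W2.T * (1 - h ** 2)  # tanh derivative
    grad_W1 = X.T @ grad_h
    grad_b1 = grad_h.sum(axis=0)
    # Plain gradient-descent updates.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

_, out = forward(X)
final_loss = np.mean((out - Y) ** 2)
print(f"loss: {initial_loss:.4f} -> {final_loss:.4f}")
```

The repository factors these pieces into separate layer, activation, loss, and optimizer classes; the math is the same.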

This project is not meant to be production-ready; instead, it serves as a foundation for understanding the inner workings of Neural Networks down to the underlying mathematics.

Collaborations in implementing and maintaining this project are welcome. Kindly reach out to me if interested.

Contributors 🌟

References 📚

© 2020 Ryan Dsilva
