vanillaNetwork

Designing neurons from scratch without using any libraries

In this assignment, I derive gradient descent and backpropagation mathematically from scratch and implement them in Python. These form the building blocks of the neural network layers that libraries normally abstract away, and I use them to explore how models learn patterns in data.
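
The notebook contains the actual implementation; the following is only a minimal sketch of the core idea, a single sigmoid neuron fitted by gradient descent in plain Python. Every name here is illustrative and not taken from the notebook:

```python
import math

def sigmoid(z):
    # Activation: squashes any real value into (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def train_neuron(xs, ys, lr=0.5, epochs=1000):
    """Fit y = sigmoid(w*x + b) to (xs, ys) with squared-error loss."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            # Forward pass
            y_hat = sigmoid(w * x + b)
            # Backward pass: chain rule through loss -> activation -> affine
            dloss_dyhat = 2.0 * (y_hat - y)   # d/dy_hat of (y_hat - y)^2
            dyhat_dz = y_hat * (1.0 - y_hat)  # derivative of the sigmoid
            grad_z = dloss_dyhat * dyhat_dz
            # Gradient descent update (dz/dw = x, dz/db = 1)
            w -= lr * grad_z * x
            b -= lr * grad_z
    return w, b

# Tiny demo: learn a threshold near x = 0.5
w, b = train_neuron([0.0, 0.25, 0.75, 1.0], [0, 0, 1, 1])
print(w, b, sigmoid(w * 0.9 + b))  # prediction for x = 0.9 should be near 1
```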

I also test the effects of varying the following structural and algorithmic hyperparameters on the learning process (a sketch of one way to organize such a sweep follows the list):

  • Loss / Cost Function
  • Activation Function
  • Optimization algorithm
  • Learning Iterations / Epochs
  • Learning Rate
  • Regularization
  • Train-Test Splits
  • Number of hidden units
  • Initialization
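
One way such a sweep could be organized is a simple grid search over configurations. The grids and names below are hypothetical placeholders, not the notebook's actual values:

```python
from itertools import product

# Hypothetical hyperparameter grids; the notebook's actual values may differ.
grid = {
    "activation": ["sigmoid", "tanh", "relu"],
    "learning_rate": [0.01, 0.1, 0.5],
    "hidden_units": [4, 8, 16],
    "epochs": [500, 2000],
}

# Enumerate every combination; each `config` would parameterize one training
# run whose test accuracy is recorded and later compared across settings.
for values in product(*grid.values()):
    config = dict(zip(grid.keys(), values))
    print(config)  # e.g. {'activation': 'sigmoid', 'learning_rate': 0.01, ...}
```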

Link to Notebook

Figure: AccuraciesVariation (variation in accuracy across hyperparameter settings)
