# vanillaNetwork

Designing neurons from scratch without using any libraries

In this assignment, I derive gradient descent and backpropagation mathematically from scratch and implement them in Python. These form the building blocks of the abstract neural-network layers, which I then use to explore how models learn patterns in data.
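The core loop described above can be sketched as follows. This is a minimal illustration, not the assignment's actual implementation: a single sigmoid neuron on a hypothetical toy dataset, trained by batch gradient descent, with the backpropagation gradient derived by hand from the mean-squared-error loss.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy dataset: one input feature, label 1 when x > 0.5
X = [0.1, 0.4, 0.6, 0.9]
Y = [0, 0, 1, 1]

w, b = 0.0, 0.0   # initialization
lr = 1.0          # learning rate

def mse(w, b):
    return sum((sigmoid(w * x + b) - y) ** 2 for x, y in zip(X, Y)) / len(X)

loss_before = mse(w, b)

for epoch in range(2000):                 # learning iterations / epochs
    dw = db = 0.0
    for x, y in zip(X, Y):
        a = sigmoid(w * x + b)            # forward pass
        # Backprop through MSE and sigmoid by the chain rule:
        # dL/dz = (a - y) * a * (1 - a)
        dz = (a - y) * a * (1 - a)
        dw += dz * x
        db += dz
    w -= lr * dw / len(X)                 # gradient descent step
    b -= lr * db / len(X)

loss_after = mse(w, b)
```

After training, the loss has dropped from its initial value and the neuron separates the low inputs from the high ones.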

I also test how varying the following structural and algorithmic hyperparameters affects the learning process:

- Loss / Cost Function
- Activation Function
- Optimization Algorithm
- Learning Iterations / Epochs
- Learning Rate
- Regularization
- Train-Test Splits
- Number of Hidden Units
- Initialization
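A hyperparameter study like the one above typically sweeps one setting at a time while holding the rest fixed. As a hedged sketch (again using a hypothetical single-neuron trainer, not the notebook's code), varying the learning rate might look like this:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical toy dataset: one input feature, label 1 when x > 0.5
X = [0.1, 0.4, 0.6, 0.9]
Y = [0, 0, 1, 1]

def train(lr, epochs):
    """Train a single sigmoid neuron; return its final mean-squared error."""
    w = b = 0.0
    for _ in range(epochs):
        dw = db = 0.0
        for x, y in zip(X, Y):
            a = sigmoid(w * x + b)
            dz = (a - y) * a * (1 - a)   # backprop: MSE + sigmoid
            dw += dz * x
            db += dz
        w -= lr * dw / len(X)
        b -= lr * db / len(X)
    return sum((sigmoid(w * x + b) - y) ** 2 for x, y in zip(X, Y)) / len(X)

# Sweep the learning rate while holding epochs fixed
results = {lr: train(lr, epochs=2000) for lr in (0.01, 0.1, 1.0)}
```

The same pattern extends to the other hyperparameters: make each one a parameter of `train`, sweep it, and compare the resulting accuracies or losses.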

Link to Notebook

(Figure: AccuraciesVariation)