Neural Networks From Scratch

🌟 Implementation of Neural Networks from Scratch Using Python & NumPy 🌟

Uses Python 3.7.4

Optimizer Functions

Optimizers update the weight parameters and bias terms to minimize the loss function, ideally driving it toward the global minimum. Each rule below lists its symbols; minimal NumPy sketches of each update follow the list.

  • Gradient Descent

    W: weights | dW: gradient of the loss with respect to W | alpha: learning rate

  • Gradient Descent with Momentum

    vdW: exponentially weighted average of past gradients | beta: momentum term (dampening factor) | dW: gradient of the loss with respect to W

  • RMSProp

    sdW: exponentially weighted average of squared gradients | beta: decay rate | epsilon: small constant for numerical stability

  • Adam

    Combines Momentum's first-moment estimate with RMSProp's second-moment estimate | beta1, beta2: decay rates | epsilon: small constant for numerical stability
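
A minimal sketch of the plain gradient descent update, assuming W and dW are NumPy arrays of the same shape (the function names in these sketches are illustrative, not this repository's API):

```python
import numpy as np

def gradient_descent_update(W, dW, alpha=0.01):
    # Step the weights directly against the gradient.
    return W - alpha * dW

# Example: one update on a 2x2 weight matrix.
W = np.ones((2, 2))
dW = np.full((2, 2), 0.5)
W = gradient_descent_update(W, dW, alpha=0.1)
```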
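
Momentum keeps a running average of past gradients (vdW) and steps along that average instead of the raw gradient, which dampens oscillations. A sketch under the same assumptions:

```python
import numpy as np

def momentum_update(W, dW, vdW, alpha=0.01, beta=0.9):
    # Blend the new gradient into the running average, then step along it.
    vdW = beta * vdW + (1 - beta) * dW
    W = W - alpha * vdW
    return W, vdW

# vdW starts at zero and is carried across iterations.
W = np.ones((2, 2))
vdW = np.zeros_like(W)
W, vdW = momentum_update(W, np.full((2, 2), 0.5), vdW, alpha=0.1)
```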
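
RMSProp instead averages the squared gradients (sdW) and divides each step by the root of that average, giving every parameter its own effective step size. A sketch:

```python
import numpy as np

def rmsprop_update(W, dW, sdW, alpha=0.001, beta=0.9, epsilon=1e-8):
    # Track the running average of squared gradients, then scale the step by it.
    sdW = beta * sdW + (1 - beta) * dW ** 2
    W = W - alpha * dW / (np.sqrt(sdW) + epsilon)
    return W, sdW
```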
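
Adam combines the two: a bias-corrected first moment (as in Momentum) and a bias-corrected second moment (as in RMSProp). The defaults below (beta1=0.9, beta2=0.999, epsilon=1e-8) follow the original Adam paper; t is the 1-based iteration count:

```python
import numpy as np

def adam_update(W, dW, vdW, sdW, t, alpha=0.001,
                beta1=0.9, beta2=0.999, epsilon=1e-8):
    # First-moment (Momentum-style) and second-moment (RMSProp-style) estimates.
    vdW = beta1 * vdW + (1 - beta1) * dW
    sdW = beta2 * sdW + (1 - beta2) * dW ** 2
    # Bias correction compensates for the zero initialization of vdW and sdW.
    vdW_hat = vdW / (1 - beta1 ** t)
    sdW_hat = sdW / (1 - beta2 ** t)
    W = W - alpha * vdW_hat / (np.sqrt(sdW_hat) + epsilon)
    return W, vdW, sdW
```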