Deep Neural Networks From Scratch

This is an implementation of deep neural networks using nothing but Python and NumPy. I took up this project to complement the Deep Learning Specialization, offered on Coursera and taught by Andrew Ng.

Currently, the following features are supported (a brief usage sketch follows the list):

  • Layers:
    • Dense
    • Conv2D
    • DepthwiseConv2D
    • SeparableConv2D
    • Conv2DTranspose
    • MaxPooling2D
    • AveragePooling2D
    • BatchNorm
    • Dropout
    • Flatten
    • Add
    • Concatenate
  • Activations:
    • Linear
    • Sigmoid
    • Tanh
    • ReLU
    • LeakyReLU
    • ELU
    • Softmax
  • Losses:
    • BinaryCrossEntropy
    • CategoricalCrossEntropy
    • MeanSquaredError
  • Optimizers:
    • Vanilla SGD
    • SGD with momentum
    • RMSProp
    • Vanilla Adam
    • Adam with AMSGrad
  • Learning Rate Decay:
    • TimeDecay
    • ExponentialDecay
    • CosineDecay
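
To give a sense of what the TensorFlow-like API looks like in use, here is a hypothetical sketch. The import path, constructors, and method names (Model, add, compile, fit, and their parameters) are assumptions made for illustration only; consult the source for the actual interface.

```python
import numpy as np

# Hypothetical usage sketch: the import path, constructor arguments and
# method names below are illustrative assumptions, not the repository's
# confirmed API.
from dnn import Model, Dense, ReLU, Softmax, CategoricalCrossEntropy, SGD

# Synthetic data: 100 samples, 20 features, 10 classes (one-hot labels).
X = np.random.randn(100, 20)
y = np.eye(10)[np.random.randint(0, 10, size=100)]

model = Model()
model.add(Dense(units=64, activation=ReLU()))
model.add(Dense(units=10, activation=Softmax()))

model.compile(loss=CategoricalCrossEntropy(), optimizer=SGD(learning_rate=0.01))
model.fit(X, y, epochs=10, batch_size=32)
```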

It is also straightforward to add new layers, activations, losses, optimizers, and decay algorithms.

Note: There is no automatic differentiation. When extending the library, users must define the derivatives needed for backpropagation themselves, as sketched below.
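
For instance, adding a new activation means writing both its forward computation and its derivative by hand. Below is a minimal, self-contained NumPy sketch of a Swish activation; the class and method names are illustrative assumptions and may not match the repository's actual base-class interface.

```python
import numpy as np

# Illustrative sketch only: the class/method names are assumptions, not
# the repository's actual extension interface. The point is that the
# derivative must be supplied by hand, since there is no autodiff.
class Swish:
    def forward(self, x):
        # swish(x) = x * sigmoid(x)
        s = 1.0 / (1.0 + np.exp(-x))
        return x * s

    def derivative(self, x):
        # d/dx [x * sigmoid(x)] = sigmoid(x) + x * sigmoid(x) * (1 - sigmoid(x))
        s = 1.0 / (1.0 + np.exp(-x))
        return s + x * s * (1.0 - s)


if __name__ == "__main__":
    x = np.linspace(-3.0, 3.0, 7)
    act = Swish()
    print(act.forward(x))     # forward pass
    print(act.derivative(x))  # gradient used during backpropagation
```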

Hope you like it! Happy learning!
