elittman27/neural-networks
Neural-Networks

Derives and implements neural networks with arbitrarily deep fully connected layers, convolutional layers, and pooling layers. Forward propagation makes predictions; backward propagation computes the gradient of the network. Uses cross-entropy loss as the cost function and exposes several activation functions, including SoftMax, ReLU, and Sigmoid.
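The repository's own implementations live in the files listed below. As an independent illustration of the forward/backward pattern described above (not this repo's API), here is a minimal NumPy sketch of a single SoftMax layer trained with cross-entropy loss; all function names are hypothetical:

```python
import numpy as np

def softmax(z):
    # Subtract the row-wise max for numerical stability before exponentiating.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y):
    # Mean negative log-likelihood of the true classes (y holds integer labels).
    n = probs.shape[0]
    return -np.log(probs[np.arange(n), y]).mean()

def forward(X, W, b):
    # Linear layer followed by SoftMax: predictions are class probabilities.
    return softmax(X @ W + b)

def backward(X, probs, y):
    # For SoftMax + cross-entropy, the gradient w.r.t. the logits is
    # (probs - one_hot(y)) / n; chain through the linear layer for dW, db.
    n = X.shape[0]
    d = probs.copy()
    d[np.arange(n), y] -= 1.0
    d /= n
    return X.T @ d, d.sum(axis=0)
```

A few steps of gradient descent with these two functions (`W -= lr * dW; b -= lr * db` after each `backward` call) will drive the cross-entropy loss down on a toy dataset, which is the same loop the full network runs layer by layer.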

To run all the tests, run:

python -m unittest -v

The most relevant code is in the following files:

  • Convolutions and Layers: neural_networks/layers.py
  • Overall network: neural_networks/models.py
  • Cost functions: neural_networks/losses.py
  • Activation functions: neural_networks/activations.py
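For a sense of what the activation functions above compute, here is a rough NumPy sketch of ReLU and Sigmoid with their derivatives, as used during backward propagation. These are standard definitions with hypothetical helper names, not the API of neural_networks/activations.py:

```python
import numpy as np

def relu(z):
    # ReLU passes positive inputs through and zeroes out the rest.
    return np.maximum(0.0, z)

def relu_grad(z):
    # Derivative is 1 where the input was positive, 0 elsewhere.
    return (z > 0).astype(float)

def sigmoid(z):
    # Squashes inputs into (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_grad(z):
    # Derivative has the convenient closed form s * (1 - s).
    s = sigmoid(z)
    return s * (1.0 - s)
```

During the backward pass, each layer multiplies the incoming gradient elementwise by the activation's derivative evaluated at the pre-activation values.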
