A feed-forward neural network library that computes gradients using the computational-graph approach. It supports networks of arbitrary size, as defined by the user.
The implementation uses only Python and NumPy. Matplotlib was used to test the network and to plot graphs for observing its learning.
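To illustrate the computational-graph approach mentioned above, here is a minimal sketch in plain NumPy (this is not the library's actual API, just the underlying idea): each node caches its inputs during the forward pass and uses them to propagate gradients backwards via the chain rule.

```python
import numpy as np

class Multiply:
    """A single graph node computing out = x * w (element-wise)."""

    def forward(self, x, w):
        # Cache the inputs; the backward pass needs them.
        self.x, self.w = x, w
        return x * w

    def backward(self, grad_out):
        # Chain rule: d(x*w)/dx = w and d(x*w)/dw = x,
        # each scaled by the gradient flowing in from downstream.
        return grad_out * self.w, grad_out * self.x

node = Multiply()
out = node.forward(np.array([2.0]), np.array([3.0]))   # forward pass
dx, dw = node.backward(np.array([1.0]))                # backward pass
```

Chaining many such nodes (linear transforms, activations, a loss at the end) and calling `backward` in reverse topological order is exactly the mechanism a computational-graph library automates.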
- Linear (no activation; only a linear transform)
- L1 Loss
- L2 Loss
- Cross Entropy
- SVM Loss
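The losses listed above can each be written in a few lines of NumPy. The sketch below shows one common formulation of each (mean absolute error, mean squared error, softmax cross-entropy, and multiclass hinge loss); the exact definitions and reductions used by this library may differ, so treat these as illustrative.

```python
import numpy as np

def l1_loss(pred, target):
    # Mean absolute error.
    return np.abs(pred - target).mean()

def l2_loss(pred, target):
    # Mean squared error.
    return ((pred - target) ** 2).mean()

def cross_entropy(logits, label):
    # Softmax + negative log-likelihood for a single example.
    # Subtracting the max keeps the exponentials numerically stable.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[label]

def svm_loss(scores, label, margin=1.0):
    # Multiclass hinge loss: penalize classes whose score comes
    # within `margin` of the correct class's score.
    margins = np.maximum(0.0, scores - scores[label] + margin)
    margins[label] = 0.0
    return margins.sum()
```

For example, `l2_loss(np.array([1.0, 2.0]), np.array([0.0, 0.0]))` averages the squared errors 1 and 4, giving 2.5.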
When contributing to this repository, please first discuss the change you wish to make with the owners of this repository via an issue, email, or any other method. Ensure any install or build dependencies are removed before the end of the layer when doing a build. Update the README.md with details of changes to the interface; this includes new environment variables, exposed ports, useful file locations, and container parameters.
This project is licensed under the MIT License - see the LICENSE.md file for details.
(Computational graph image source: https://colah.github.io/)