Note: this is an old repository I created when I was first learning about neural networks and wanted to be sure I understood them well. Given the current popularity of neural networks, if you want to learn about them you're in luck: there are tons of amazing educational resources (such as Michael Nielsen's Neural Networks and Deep Learning, or the Deep Learning book by Goodfellow, Bengio and Courville) and great frameworks to play with, such as TensorFlow, Keras or PyTorch.
Want to understand how neural nets work, and see a simple implementation? I did, which is why I created this simple and (hopefully) clear implementation of a neural network. It greatly improved my understanding of how they work.
This repository is the companion to my blog posts about how neural networks work and learn. There I lay out the theory behind them, and the maths, as clearly as I could, without oversimplifying or leaving things vague.
By the end, you'll know the precise maths required to make a neural net work.
- A simple 2-layer neural net (to start learning), then an n-layer neural net, with prediction and training, using the classical backpropagation algorithm (with momentum)
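The repository's own code isn't reproduced here, but the core of a 2-layer net trained with backpropagation and momentum can be sketched in a few lines of numpy. This is an illustrative toy, not the repo's actual implementation: it assumes sigmoid activations, mean squared error, and the standard momentum update, and the layer size and hyperparameters are arbitrary choices. It learns the AND gate, which is linearly separable and so converges quickly.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)

# Toy dataset: the AND gate (linearly separable, so it converges quickly).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [0], [0], [1]], dtype=float)

# One hidden layer of 3 units, plus biases (sizes are illustrative).
W1 = rng.normal(0.0, 1.0, (2, 3))
b1 = np.zeros((1, 3))
W2 = rng.normal(0.0, 1.0, (3, 1))
b2 = np.zeros((1, 1))

# Momentum buffers, one per parameter, start at zero.
vW1, vb1 = np.zeros_like(W1), np.zeros_like(b1)
vW2, vb2 = np.zeros_like(W2), np.zeros_like(b2)
lr, mu = 0.1, 0.9  # learning rate and momentum coefficient

history = []
for _ in range(3000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    history.append(float(np.mean((out - y) ** 2)))

    # Backward pass: chain rule, using sigmoid'(s) = s * (1 - s).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Momentum update: velocity = mu * velocity - lr * gradient.
    vW2 = mu * vW2 - lr * (h.T @ d_out)
    vb2 = mu * vb2 - lr * d_out.sum(axis=0, keepdims=True)
    vW1 = mu * vW1 - lr * (X.T @ d_h)
    vb1 = mu * vb1 - lr * d_h.sum(axis=0, keepdims=True)
    W2 += vW2
    b2 += vb2
    W1 += vW1
    b1 += vb1

print(np.round(out.ravel(), 2))
```

The momentum term keeps a decaying running sum of past gradients, which smooths the updates and speeds up convergence along consistent directions.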
Want a couple of examples of how to use the code? Here are some basic things I've built:
- Let's use neural networks to predict the output of the XOR function. It's absurd, but a good illustration.
- Then, I built a simple OCR system, which recognizes hand-written digits from the MNIST database
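As a flavour of the XOR example (a sketch, not the repository's actual script), here is an n-layer version of the same idea: the weights are held in a list, one matrix per layer, so the same loop handles any depth. XOR is not linearly separable, which is exactly why a hidden layer is needed. The layer sizes `[2, 4, 1]` and hyperparameters are illustrative choices, not the repo's.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(1)

# XOR: not linearly separable, so a hidden layer is genuinely needed.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Layer sizes: 2 inputs -> 4 hidden -> 1 output (an illustrative choice).
sizes = [2, 4, 1]
W = [rng.normal(0.0, 1.0, (a, b)) for a, b in zip(sizes, sizes[1:])]
b = [np.zeros((1, n)) for n in sizes[1:]]
vW = [np.zeros_like(w) for w in W]
vb = [np.zeros_like(bi) for bi in b]
lr, mu = 0.1, 0.9  # learning rate and momentum coefficient

history = []
for _ in range(20000):
    # Forward pass, keeping every layer's activation for backprop.
    acts = [X]
    for w, bi in zip(W, b):
        acts.append(sigmoid(acts[-1] @ w + bi))
    history.append(float(np.mean((acts[-1] - y) ** 2)))

    # Backward pass, from the output layer down to the input layer.
    delta = (acts[-1] - y) * acts[-1] * (1 - acts[-1])
    for i in reversed(range(len(W))):
        gW = acts[i].T @ delta
        gb = delta.sum(axis=0, keepdims=True)
        if i > 0:  # propagate delta before W[i] is updated
            delta = (delta @ W[i].T) * acts[i] * (1 - acts[i])
        vW[i] = mu * vW[i] - lr * gW
        vb[i] = mu * vb[i] - lr * gb
        W[i] += vW[i]
        b[i] += vb[i]

print(np.round(acts[-1].ravel(), 2))
```

The MNIST OCR example is the same algorithm with larger layers (784 inputs, 10 outputs) and real data, so it isn't reproduced here.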
I also created a couple of scripts to test and compare the performance of different configurations:
- This script tests different network designs
- And if you want to visualize the results, you can make pretty graphs