Building a multilayer perceptron from scratch
The mathematics and computation that drive neural networks are often seen as erudite and impenetrable. MLP.ipynb presents a clearly illustrated example of building a neural network for handwriting recognition from scratch. The tutorial walks step by step through the mathematics and code underlying many modern machine learning algorithms.
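As a taste of what the notebook covers, here is a minimal sketch of a one-hidden-layer MLP trained with plain gradient descent. The layer sizes, sigmoid activations, learning rate, and the XOR toy problem are illustrative assumptions for this sketch, not necessarily the notebook's exact setup:

```python
import numpy as np

# Illustrative sketch: one-hidden-layer MLP with sigmoid activations,
# trained by batch gradient descent on XOR. Hyperparameters here are
# assumptions for demonstration, not taken from MLP.ipynb.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 8)); b1 = np.zeros(8)   # input -> hidden
W2 = rng.normal(0.0, 1.0, (8, 1)); b2 = np.zeros(1)   # hidden -> output

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: gradients of mean squared error through the sigmoids
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient descent updates
    W2 -= lr * h.T @ d_out; b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h;   b1 -= lr * d_h.sum(axis=0)

loss = np.mean((out - y) ** 2)
preds = (out > 0.5).astype(int).ravel()
print("final loss:", loss)
print("predictions:", preds)
```

The notebook develops the same forward-pass/backpropagation structure in full detail, with derivations for each gradient term.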
To view the notebook in your browser, simply click the MLP.ipynb file above.
To run it locally, in a terminal window:
$ git clone https://github.com/KirillShmilovich/MLP-Neural-Network-From-Scrath
$ cd MLP-Neural-Network-From-Scrath
$ jupyter-notebook MLP.ipynb