MLP

Implementation of a Multi-Layer Perceptron (MLP) with a single hidden layer whose neuron count is configurable. Training uses Stochastic Gradient Descent (SGD).

References

[1] The Backpropagation Algorithm for Training Neural Networks
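A minimal sketch of the model described above, assuming NumPy, sigmoid activations, and a squared-error loss (the repository's actual activation, loss, and class names may differ):

```python
import numpy as np

class MLP:
    """Multi-layer perceptron with one hidden layer of configurable size,
    trained one example at a time by stochastic gradient descent."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        self.lr = lr
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, n_out))
        self.b2 = np.zeros(n_out)

    @staticmethod
    def _sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def forward(self, x):
        # Cache the hidden activations; backprop reuses them.
        self.h = self._sigmoid(x @ self.W1 + self.b1)
        self.y = self._sigmoid(self.h @ self.W2 + self.b2)
        return self.y

    def sgd_step(self, x, target):
        """One SGD update on a single (x, target) example."""
        y = self.forward(x)
        # Output-layer delta for squared error with sigmoid activation.
        d2 = (y - target) * y * (1.0 - y)
        # Backpropagate the delta through W2 to the hidden layer.
        d1 = (d2 @ self.W2.T) * self.h * (1.0 - self.h)
        # Gradient-descent updates on weights and biases.
        self.W2 -= self.lr * np.outer(self.h, d2)
        self.b2 -= self.lr * d2
        self.W1 -= self.lr * np.outer(x, d1)
        self.b1 -= self.lr * d1
```

Typical usage would loop `sgd_step` over shuffled training examples for several epochs; the hidden-layer width (`n_hidden`) is the "variable-neuron" knob mentioned above.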