A neural network written from scratch in Rust that achieves 99% accuracy on the MNIST dataset. Matrix operations are partly performed on the GPU, and the Adam optimizer is used during training.
This project is based on the book "Deep Learning from Scratch" by Seth Weidman.
The neural network itself is implemented in the `src` directory. The `digit_recognition` directory contains a practical use case of the network on the MNIST dataset, split into a training phase (`digit_recognition/training`) and an interactive application (`digit_recognition/interaction`) for testing how the network performs on your own handwriting.
Create the neural network:

```rust
let mut neural_network = NeuralNetwork::<SoftmaxCrossEntropy>::new(vec![
    Layer::new_dense(RESOLUTION * RESOLUTION, 512, Some(Operation::leaky_relu())),
    Layer::new_dense(512, 256, Some(Operation::leaky_relu())),
    Layer::new_dense(256, 128, Some(Operation::leaky_relu())),
    Layer::new_dense(128, 10, None),
    Layer::new_dense(10, 10, Some(Operation::softmax())),
]);
```

Then train it via:

```rust
neural_network.learn(30, &dataset, &target);
```

Given an input matrix, the network predicts the output with:

```rust
let predicted = neural_network.predict(input);
```

Planned improvements:

- Implement convolutional layers to improve pattern recognition in images (CNN)
- Rewrite core infrastructure completely on the GPU to speed up training
- Implement dropout to prevent overfitting