Neural Network for MNIST in Cairo 1.0

Implementation of a Neural Network from scratch using Cairo 1.0 for MNIST predictions.

The NN has a simple two-layer architecture:

  • The input layer 𝑎[0] has 784 units, one per pixel of each 28x28 input image.
  • The hidden layer 𝑎[1] has 10 units with ReLU activation.
  • The output layer 𝑎[2] has 10 units, one per digit class, with softmax activation.
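The two-layer forward pass described above can be sketched in NumPy for reference (the parameter names `W1`, `b1`, `W2`, `b2` are illustrative, not the Cairo code's identifiers, and the weights here are random rather than trained):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters; the real network loads trained, quantized weights.
W1 = rng.standard_normal((10, 784)) * 0.01  # hidden layer: 10 units
b1 = np.zeros((10, 1))
W2 = rng.standard_normal((10, 10)) * 0.01   # output layer: 10 classes
b2 = np.zeros((10, 1))

def relu(z):
    return np.maximum(z, 0)

def softmax(z):
    e = np.exp(z - z.max(axis=0))  # subtract max for numerical stability
    return e / e.sum(axis=0)

def forward(a0):
    """a0: (784, m) batch of flattened 28x28 images."""
    z1 = W1 @ a0 + b1   # (10, m)
    a1 = relu(z1)       # hidden activation
    z2 = W2 @ a1 + b2   # (10, m)
    a2 = softmax(z2)    # per-class probabilities, columns sum to 1
    return a2

x = rng.random((784, 1))  # one dummy "image"
probs = forward(x)
print(probs.shape)        # (10, 1)
```

A prediction is then simply the argmax over the 10 output units.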

Functionalities implemented in Cairo 1.0:

  • Vector implementation with operations: sum, max, min, argmax.
  • Matrix implementation with operations: get, dot, add, len.
  • Tensor implementation.
  • 8-bit weight quantization based on ONNX quantization.
  • ReLU activation.
  • Forward propagation of NN.
  • Predict method for NN.
  • Pseudo-softmax activation optimized for quantized values.
  • Weight loading into the Cairo NN from a trained TensorFlow NN.
  • MNIST inferences using Cairo NN.
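To illustrate the ONNX-style 8-bit quantization mentioned above, here is a minimal sketch of asymmetric linear quantization (scale and zero-point, uint8 storage). The function names and the sample values are my own; the repository's Cairo implementation may differ in details such as rounding mode:

```python
import numpy as np

def quantize_linear(w, num_bits=8):
    """ONNX-style asymmetric linear quantization:
    q = round(w / scale) + zero_point, stored as uint8."""
    qmin, qmax = 0, 2**num_bits - 1
    # Include 0.0 in the range so zero is exactly representable.
    w_min, w_max = min(w.min(), 0.0), max(w.max(), 0.0)
    scale = (w_max - w_min) / (qmax - qmin)
    zero_point = int(round(qmin - w_min / scale))
    q = np.clip(np.round(w / scale) + zero_point, qmin, qmax).astype(np.uint8)
    return q, scale, zero_point

def dequantize_linear(q, scale, zero_point):
    return (q.astype(np.int32) - zero_point) * scale

w = np.array([-1.5, -0.2, 0.0, 0.7, 1.5])
q, s, zp = quantize_linear(w)
w_hat = dequantize_linear(q, s, zp)
print(np.max(np.abs(w - w_hat)))  # reconstruction error stays below one scale step
```

This also motivates the pseudo-softmax: on quantized integer logits, computing `exp` is expensive, and since softmax is monotonic, a cheaper approximation can preserve the argmax needed for prediction.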

Built with auditless/cairo-template

Working with the project

Currently supports building and testing contracts.

Build

Build the contracts.

$ make build

Test

Run the tests in src/test:

$ make test

Format

Format the Cairo source code (using Scarb):

$ make fmt

Credits
