Supplemental Code for ICML Submission
This directory contains files associated with the ICML submission "Memory-Optimal Direct Convolutions for Maximizing Classification Accuracy in Embedded Applications".
The main dependencies are shown below.
- Python 2.7
- TensorFlow 1.13
- Keras 2.2
The contents follow the different segments of the (relatively linear) workflow of the project, so reading the files in the order below is the easiest way to understand them.
- `MNIST - Data Generator.ipynb` - generates the augmented dataset.
- `MNIST - Sweeps.ipynb` - finds the optimal architecture through a brute-force sweep of hyperparameters.
- `MNIST - Training.ipynb` - trains the network identified in `MNIST - Sweeps.ipynb` on the data generated in `MNIST - Data Generator.ipynb`.
- `MNIST - Arduino.ipynb` - interfaces to, and verifies the accuracy of, the Arduino, using the final network parameters from `MNIST - Training.ipynb`.
- `MNIST - Plots.ipynb` - generates the plots for the paper, including interesting relationships between network size and accuracy.
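The sweep notebook's actual search loop is not reproduced here, but as a minimal sketch, a brute-force hyperparameter sweep of the kind described above can be written as follows (the `train_and_eval` callback and the parameter names are hypothetical stand-ins, not the notebook's real interface):

```python
import itertools

def grid_sweep(train_and_eval, grid):
    """Exhaustively try every combination in `grid` and keep the best.

    `grid` maps hyperparameter names to lists of candidate values;
    `train_and_eval` trains a model with those values and returns its
    validation accuracy.
    """
    best_params, best_acc = None, float("-inf")
    for combo in itertools.product(*grid.values()):
        params = dict(zip(grid.keys(), combo))
        acc = train_and_eval(**params)
        if acc > best_acc:
            best_params, best_acc = params, acc
    return best_params, best_acc
```

In the actual notebook each evaluation would train a Keras model; here the callback is only a placeholder for that step.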
Finally, `Arduino/cnn.cpp` contains the Arduino code, which reads in a full neural-network specification and a stream of images and outputs the predicted class for each image, all via serial communication.
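The exact serial protocol is defined in `Arduino/cnn.cpp` and exercised from `MNIST - Arduino.ipynb`. Purely as an illustration (the raw-byte framing here is an assumption, not the repository's actual protocol), an image can be flattened into a byte stream before being written to the port:

```python
def image_to_bytes(image):
    """Flatten a 2-D image of uint8 pixels (0-255) into raw bytes.

    The resulting buffer could then be written to the board with, e.g.,
    pyserial's `Serial.write`; the framing the sketch expects is an
    assumption here, not the protocol used by `cnn.cpp`.
    """
    flat = [p for row in image for p in row]
    if not all(0 <= p <= 255 for p in flat):
        raise ValueError("pixels must already be quantized to uint8")
    return bytes(flat)
```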
In addition, see the helper files:
- `network_parameterization.py` - provides a concise way of defining networks so that they can be analyzed for memory usage and then built easily.
- `quantization_layers.py` - defines custom Keras layers for quantization during training.
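The real APIs live in the two helper files above; purely as an illustrative sketch (the function names and the uniform quantization scheme below are assumptions, not the repository's actual interfaces), the two kinds of bookkeeping they enable look roughly like this:

```python
def conv_layer_memory(in_h, in_w, in_c, out_c, k):
    """Weight and activation counts for one 'valid' direct convolution."""
    weights = k * k * in_c * out_c + out_c      # kernels plus biases
    out_h, out_w = in_h - k + 1, in_w - k + 1   # no padding, stride 1
    activations = out_h * out_w * out_c         # output feature map size
    return weights, activations

def quantize_activation(x, num_bits, max_val):
    """Uniformly quantize a non-negative activation to `num_bits` bits."""
    levels = 2 ** num_bits - 1
    x = max(0.0, min(max_val, x))     # clip to the representable range
    q = round(x * levels / max_val)   # integer level index
    return q * max_val / levels       # dequantized value
```

Counting weights and the largest activation buffer per layer is what makes it possible to check a candidate architecture against an embedded memory budget before training it.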
Some results have also been included for convenience:
- `sweeps` - contains the results saved during the hyperparameter sweeps.
- `models` - contains the models saved during training.

The data files have not been included because they are too large.
The Arduino code uses Arduino-Makefile. To run it:
- Connect your Arduino (we used an Arduino Nano with the ATmega328P chip).
- Navigate to the `Arduino/` directory and run `make upload`.
- See `MNIST - Arduino.ipynb` for an example of communicating with the Arduino from Python.
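Arduino-Makefile drives the build; a minimal `Makefile` for a Nano with the ATmega328P might look like the following (the `ARDMK_DIR` path and the serial port are assumptions that depend on where Arduino-Makefile is installed and how the board enumerates):

```makefile
# Board settings for an Arduino Nano with the ATmega328P
BOARD_TAG    = nano
BOARD_SUB    = atmega328
MONITOR_PORT = /dev/ttyUSB0   # adjust to your serial device

# Pull in the Arduino-Makefile rules (path is installation-specific)
include $(ARDMK_DIR)/Arduino.mk
```

With a file like this in place, `make upload` compiles the sketch and flashes it to the board.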