Optical Neural Networks with Quantization-Aware Training (QAT)

This repository contains the trained model and training scripts for the neural network executed on the optical matrix-vector multiplier demonstrated in the following paper:

Tianyu Wang, Shi-Yuan Ma, Logan G. Wright, Tatsuhiro Onodera, Brian Richard, and Peter L. McMahon. "An optical neural network using less than 1 photon per multiplication." Nature Communications 13, 123 (2022). https://doi.org/10.1038/s41467-021-27774-8

The device control scripts for the experimental implementation are available here.

To improve the robustness of optical neural networks (ONNs) against shot noise, we employed quantization-aware training (QAT), which quantizes the activations and weights of the network so that classification can be performed at moderate numerical precision. Besides the neural network training scripts, this repository also includes scripts for simulating neural network performance under the standard quantum limit (SQL).
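As a rough illustration of the QAT idea, the sketch below quantizes the hidden activations of a small fully connected network using a straight-through estimator. The class names, layer sizes, and bit width are illustrative assumptions, not the repository's actual implementation.

```python
import torch
import torch.nn as nn

class FakeQuantize(torch.autograd.Function):
    """Uniformly quantize values in [0, 1] to n_bits in the forward pass;
    pass gradients straight through in the backward pass."""
    @staticmethod
    def forward(ctx, x, n_bits):
        levels = 2 ** n_bits - 1
        return torch.round(x.clamp(0.0, 1.0) * levels) / levels  # snap to the quantization grid
    @staticmethod
    def backward(ctx, grad_output):
        return grad_output, None  # straight-through estimator

class QuantizedMLP(nn.Module):
    """Fully connected classifier whose hidden activations are quantized during training."""
    def __init__(self, n_bits=4):
        super().__init__()
        self.fc1 = nn.Linear(784, 100)  # placeholder layer sizes (MNIST-style input)
        self.fc2 = nn.Linear(100, 10)
        self.n_bits = n_bits
    def forward(self, x):
        x = torch.sigmoid(self.fc1(x.view(x.size(0), -1)))  # bounded activations
        x = FakeQuantize.apply(x, self.n_bits)               # quantize activations
        return self.fc2(x)
```

Weight quantization can be handled analogously by fake-quantizing each weight tensor before it is used in the forward pass.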

Helper functions for running functions (e.g., neural network training) in parallel on GPUs.

Anaconda environment setup information.

A minimalist Python script for training fully connected neural networks with QAT, requiring only PyTorch (1.7.0) and torchvision (0.8.1) to run.
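For orientation, a minimal training loop of the kind such a script would contain might look like the following; MNIST, the batch size, and the optimizer settings are assumptions for illustration, and the actual script may differ in its details.

```python
import torch
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def train(model, epochs=10, lr=1e-3):
    """Train the QuantizedMLP sketch above; MNIST is assumed here for illustration."""
    device = "cuda" if torch.cuda.is_available() else "cpu"
    data = datasets.MNIST("data", train=True, download=True, transform=transforms.ToTensor())
    loader = DataLoader(data, batch_size=128, shuffle=True)
    optimizer = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = torch.nn.CrossEntropyLoss()
    model.to(device)
    for _ in range(epochs):
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            optimizer.zero_grad()
            loss = loss_fn(model(images), labels)
            loss.backward()
            optimizer.step()
    return model
```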

A Jupyter notebook that trains batches of neural networks with QAT and supports additional functionality: parallel training on GPUs, hyperparameter search, neural architecture search, and logging of training results. The notebook requires additional packages: Ray (1.0.0), Optuna (1.5.0), and wandb (0.9.7).
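As a rough sketch of how Ray Tune can run such a search in parallel on GPUs, the snippet below reuses the QuantizedMLP and train() sketches from above; the search space, trial count, and use of MNIST for evaluation are illustrative assumptions, and the Optuna and wandb integrations used by the notebook are omitted here.

```python
import torch
from ray import tune
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

def train_onn(config):
    """One Tune trial: train a QAT network (sketches above) and report test accuracy."""
    model = train(QuantizedMLP(n_bits=config["n_bits"]), epochs=5, lr=config["lr"])
    test = datasets.MNIST("data", train=False, download=True, transform=transforms.ToTensor())
    device = next(model.parameters()).device
    correct = total = 0
    with torch.no_grad():
        for images, labels in DataLoader(test, batch_size=256):
            pred = model(images.to(device)).argmax(dim=1).cpu()
            correct += (pred == labels).sum().item()
            total += labels.size(0)
    tune.report(accuracy=correct / total)  # report the trial's metric to Tune

analysis = tune.run(
    train_onn,
    config={
        "lr": tune.loguniform(1e-4, 1e-1),    # learning-rate search space
        "n_bits": tune.choice([2, 3, 4, 5]),  # activation-precision search space
    },
    num_samples=20,                  # number of hyperparameter samples
    resources_per_trial={"gpu": 1},  # schedule one trial per available GPU
)
print(analysis.get_best_config(metric="accuracy", mode="max"))
```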

A Jupyter notebook that tests the accuracy of the trained neural networks under simulated photon shot noise at varying photon budgets (i.e., photons per scalar multiplication).
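One simple way to simulate such shot noise, sketched below, is to rescale the ideal matrix-vector product to photon counts, sample a Poisson distribution, and rescale back. The sketch assumes a non-negative signal and a per-output photon budget proportional to the number of scalar multiplications; the repository's notebook may model the detection process differently.

```python
import torch

def simulate_shot_noise(y_ideal, photons_per_mult, n_mults):
    """Add Poisson shot noise to an ideal matrix-vector product, assuming each
    output element accumulates photons_per_mult * n_mults photons at full scale."""
    budget = photons_per_mult * n_mults                        # photons per output element
    scale = budget / y_ideal.abs().max().clamp(min=1e-12)      # signal -> photon counts
    counts = torch.poisson((y_ideal * scale).clamp(min=0.0))   # sample detected photons
    return counts / scale                                      # photon counts -> signal

# Example: a noisy layer evaluation at roughly 1 photon per multiplication.
# y_noisy = simulate_shot_noise(x @ W.t(), photons_per_mult=1.0, n_mults=W.shape[1])
```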

A neural network model with three hidden layers, trained with QAT. This is the model that was ultimately executed on the experimental setup of the optical matrix-vector multiplier.
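For reference, a three-hidden-layer fully connected architecture of the kind described here can be defined and loaded in PyTorch as sketched below; the layer widths and the file name are placeholders, not the dimensions or path of the model shipped in this repository.

```python
import torch
import torch.nn as nn

# Placeholder architecture: three hidden layers, widths chosen arbitrarily.
model = nn.Sequential(
    nn.Linear(784, 100), nn.Sigmoid(),
    nn.Linear(100, 100), nn.Sigmoid(),
    nn.Linear(100, 100), nn.Sigmoid(),
    nn.Linear(100, 10),
)

# Hypothetical file name; the architecture must match the saved state dict.
# model.load_state_dict(torch.load("qat_model_3_hidden_layers.pt", map_location="cpu"))
# model.eval()
```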

License

The code in this repository is released under the following license:

Creative Commons Attribution 4.0 International

A copy of this license is included in this repository as license.txt.