Spiking Neural Networks (SNNs) are promising for neuromorphic computing due to their biological plausibility and energy efficiency. However, training methods such as Backpropagation Through Time (BPTT) and Real-Time Recurrent Learning (RTRL) remain computationally intensive. This work introduces an integer-only, online training algorithm that uses a mixed-precision approach to improve efficiency and reduce memory usage by over 60%. The method replaces floating-point operations with integer arithmetic to enable hardware-friendly implementation, and it generalizes to Convolutional and Recurrent SNNs (CSNNs, RSNNs), showing versatility across architectures. Evaluations on MNIST and the Spiking Heidelberg Digits (SHD) dataset demonstrate that mixed-precision models with 16-bit shadow weights and 8- or 12-bit inference weights achieve accuracy comparable to or better than full-precision baselines. Despite some limitations in low-precision and deeper models, performance remains robust. The proposed integer-only online learning algorithm thus offers an effective way to train SNNs efficiently, enabling deployment on resource-constrained neuromorphic hardware without sacrificing accuracy.
- Spiking Neural Network (SNN)
- Convolutional Spiking Neural Network (CSNN)
- Recurrent Spiking Neural Network (RSNN)
- MNIST
- Spiking Heidelberg Digits (SHD)
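To make the integer-only idea concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) update using nothing but integer arithmetic, with the leak implemented as a power-of-two bit shift. This illustrates the general technique, not this repository's implementation; all names and constants (`lif_step`, `LEAK_SHIFT`, `V_THRESHOLD`) are hypothetical.

```python
import numpy as np

# Hypothetical fixed-point constants (illustrative values only)
LEAK_SHIFT = 4          # leak via v >> 4, i.e. decay factor 1 - 2**-4 = 0.9375
V_THRESHOLD = 1 << 12   # firing threshold in integer membrane units

def lif_step(v: np.ndarray, input_current: np.ndarray):
    """One integer-only LIF time step; v and input_current are int32 arrays."""
    v = v - (v >> LEAK_SHIFT) + input_current    # leaky integration, integers only
    spiked = v >= V_THRESHOLD
    v = np.where(spiked, 0, v).astype(np.int32)  # hard reset on spike
    return v, spiked.astype(np.int8)             # binary spike output

# Example: 4 neurons driven by a constant integer input current
v = np.zeros(4, dtype=np.int32)
for _ in range(10):
    v, s = lif_step(v, np.full(4, 600, dtype=np.int32))
```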
```
Integer-Arithmetic-SNN-Learning/
├── core/          # Shared utilities and configurations
├── models/        # Model implementations
└── experiments/   # Experiments organized by dataset
```
- MP 16-N: mixed-precision configuration with 16-bit shadow weights and N-bit inference weights (e.g., MP 16-8 uses 8-bit inference weights)
- FP32: full-precision floating point
| Model    | MP 16-4        | MP 16-8        | MP 16-12       | MP 16-16       | FP32           |
|----------|----------------|----------------|----------------|----------------|----------------|
| SNN      | 49.85 ± 1.29 % | 50.10 ± 1.14 % | 62.06 ± 1.16 % | 61.92 ± 1.53 % | 55.27 ± 1.97 % |
| SNN [1]  | -              | -              | -              | -              | 48.10 ± 1.60 % |
| RSNN     | 57.62 ± 0.95 % | 64.63 ± 1.49 % | 70.50 ± 1.43 % | 67.75 ± 1.34 % | 71.64 ± 0.95 % |
| RSNN [1] | -              | -              | -              | -              | 71.40 ± 1.90 % |
[1] Cramer, B., Stradmann, Y., Schemmel, J., Zenke, F.: The Heidelberg Spiking Data Sets for the Systematic Evaluation of Spiking Neural Networks. IEEE Transactions on Neural Networks and Learning Systems 33(7), 2744–2757 (Jul 2022). https://doi.org/10.1109/TNNLS.2020.3044364
See the paper for full details and additional results.
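As a rough illustration of the MP notation above (our reading, not code from this repository): training updates accumulate in higher-precision shadow weights, while the forward pass reads lower-precision inference weights derived from them. The shift-based truncation and all names below are assumptions.

```python
import numpy as np

SHADOW_BITS, INFER_BITS = 16, 8      # the "MP 16-8" configuration
SHIFT = SHADOW_BITS - INFER_BITS     # drop the 8 low-order bits

def to_inference(shadow_w: np.ndarray) -> np.ndarray:
    """Quantize int16 shadow weights to int8 inference weights by truncation."""
    return (shadow_w >> SHIFT).astype(np.int8)  # arithmetic right shift

def apply_update(shadow_w: np.ndarray, grad_int: np.ndarray) -> np.ndarray:
    """Accumulate an integer-valued update into the shadow weights,
    saturating at the int16 range."""
    lo, hi = -(1 << (SHADOW_BITS - 1)), (1 << (SHADOW_BITS - 1)) - 1
    return np.clip(shadow_w.astype(np.int32) - grad_int, lo, hi).astype(np.int16)
```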
Python version: 3.10.14
- Clone the repository:

  ```bash
  git clone https://github.com/ERNIS-LAB/Integer-Arithmetic-SNN-Learning.git
  cd Integer-Arithmetic-SNN-Learning
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```
Note: For the full list of packages and exact versions used in our experiments, please refer to `requirements-lock.txt`.
- MNIST config:
  - The training code automatically downloads the MNIST dataset using Torchvision (see the sketch after this list)
  - Train / val / test split
- SHD config:
  - The training code automatically downloads the SHD dataset using spikingjelly (see the sketch after this list)
  - Train / test split (for comparison purposes); the test accuracy is reported as val_acc
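For reference, both datasets can be fetched with the libraries named above. This is a minimal sketch; the roots and frame settings are placeholder choices, not necessarily those used by the training configs.

```python
import torchvision
from spikingjelly.datasets.shd import SpikingHeidelbergDigits

# MNIST via Torchvision (downloaded on first use)
mnist_train = torchvision.datasets.MNIST(root="data", train=True, download=True)

# SHD via spikingjelly (downloaded on first use); frames_number and split_by
# are placeholder values for converting event streams into dense frames
shd_train = SpikingHeidelbergDigits(root="data/shd", train=True,
                                    data_type="frame", frames_number=100,
                                    split_by="number")
```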
```bibtex
@inproceedings{gomez2025integerSNN,
  title={Full Integer Arithmetic Online Training of Spiking Neural Network},
  author={Gomez, Ismael and Tang, Guangzhi},
  booktitle={2025 International Conference on Artificial Neural Networks (ICANN)},
  pages={1--12},
  year={2025}
}
```
- Ismael Gomez Garrido: gomezgarrido.ismael@gmail.com
- Guangzhi Tang: guangzhi.tang@maastrichtuniversity.nl
Acknowledgments: This publication is part of the project Brain-inspired MatMul-free Deep Learning for Sustainable AI on Neuromorphic Processor with file number NGF.1609.243.044 of the research programme AiNed XS Europe which is (partly) financed by the Dutch Research Council (NWO) under the grant https://doi.org/10.61686/MYMVX53467.