An open-source library designed for the evaluation of Spiking Neural Networks (SNNs).

SNNCutoff is a Python package developed with a PyTorch backend, designed primarily for evaluating Spiking Neural Networks (SNNs). It offers:

  • SNN Evaluation:

    • Detailed performance metrics, e.g., accuracy, latency, and operations.
    • Adaptive inference, i.e., cutoff of SNN inference (a minimal sketch follows this list).
  • SNN Training:

    • While the emphasis is on evaluation, the toolkit also supports a diverse array of training algorithms.
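
To make the evaluation side concrete, here is a minimal sketch written in plain PyTorch rather than the SNNCutoff API; the function name, the data-loader argument, and the assumption that the model returns accumulated per-timestep outputs of shape [T, batch, num_classes] are all illustrative.

```python
import torch

@torch.no_grad()
def evaluate_fixed_timestep(model, loader, timestep, device="cpu"):
    """Accuracy and latency of an SNN read out after a fixed number of timesteps.

    Illustrative sketch only, not the SNNCutoff interface: assumes `model(x)`
    returns accumulated outputs of shape [T, batch, num_classes].
    """
    model.eval()
    correct, total = 0, 0
    for x, y in loader:
        x, y = x.to(device), y.to(device)
        outputs = model(x)                          # [T, batch, num_classes]
        pred = outputs[timestep - 1].argmax(dim=1)  # prediction after `timestep` steps
        correct += (pred == y).sum().item()
        total += y.numel()
    accuracy = correct / total
    latency = float(timestep)                       # fixed-timestep baseline latency
    return accuracy, latency
```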

Overview

  • SNN Training Algorithms: a diverse set of training algorithms is supported; see the Documentation for the full list.

  • A New Metric:

    • Optimal Cutoff Timestep (OCT): an optimal timestep that determines the minimal input-processing duration needed to maintain predictive reliability in SNNs. OCT is grounded in theoretical analysis and serves as a robust benchmark for assessing SNN models under different optimization algorithms.
  • Cutoff Approximation:

    • Timestep (Baseline): Cutoff triggered at a fixed timestep.
    • Top-K: Cutoff triggered by the gap between the top-1 and top-2 output predictions at each timestep (see the sketch after this list).
    • Others: Coming soon.
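
A minimal sketch of the Top-K rule described above, again in plain PyTorch rather than the SNNCutoff API; the threshold name `beta` and the [T, num_classes] per-sample output layout are assumptions made for illustration.

```python
import torch

@torch.no_grad()
def topk_gap_cutoff(outputs, beta):
    """Stop inference once the top-1/top-2 output gap exceeds `beta`.

    Illustrative sketch only: `outputs` is assumed to hold accumulated outputs
    for one sample with shape [T, num_classes]. Returns the predicted class and
    the number of timesteps actually used.
    """
    T = outputs.shape[0]
    for t in range(T):
        top2 = torch.topk(outputs[t], k=2).values
        if (top2[0] - top2[1]) >= beta:        # confident enough to cut off early
            return outputs[t].argmax().item(), t + 1
    return outputs[-1].argmax().item(), T      # no cutoff: use the full timestep budget
```

In this style of rule, a lower threshold cuts off earlier (lower latency) at the cost of confidence, while the fixed-timestep baseline corresponds to never cutting off early.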

More details in Documentation.

Getting Started

To begin using SNNCutoff, clone this repository and follow the setup instructions below.

Installation

  1. Clone the repo
git clone https://github.com/Dengyu-Wu/snncutoff.git
  2. Install PyTorch and the required packages
pip install -r requirements.txt

Training and Evaluation

We provide training and evaluation examples in the scripts folder.

Contributing

Check the contributing guidelines if you want to get involved with developing SNNCutoff.

Acknowledgments

We extend our appreciation to everyone who has contributed to the development of this project, both directly and indirectly.