"Et tout d'un coup le souvenir m'est apparu." - Marcel Proust, Du côté de chez Swann
We consider recurrent neural networks of continuous-time spiking neurons. Given one (or many) prescribed spike trains, the network weights are learned such that the network can reproduce them.
Despite working in continuous time, without a clock, and with imprecise and noisy neurons, our networks can store and recall prescribed spike trains (i.e., memories) with high temporal stability. To the best of our knowledge, we are the first to explicitly demonstrate associative recall of memorized spike trains in continuous time.
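To make the setting concrete, here is a toy simulation of a single leaky integrate-and-fire neuron, the standard textbook model of a continuous-time spiking neuron. This is a generic illustration only; it is not the network model or the API implemented in the RSNN package, and all names and parameter values below are illustrative assumptions.

```python
import numpy as np

def simulate_lif(input_current, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Euler simulation of a generic leaky integrate-and-fire neuron.

    Textbook model for illustration only -- NOT the model implemented
    in the RSNN package.
    """
    v = 0.0
    spike_times = []
    for step, i_ext in enumerate(input_current):
        # Leaky integration: dv/dt = (-v + i_ext) / tau
        v += dt * (-v + i_ext) / tau
        if v >= v_thresh:            # threshold crossing -> spike
            spike_times.append(step * dt)
            v = v_reset              # reset membrane potential
    return spike_times

# A constant suprathreshold input makes the neuron fire periodically.
current = np.full(1000, 1.5)         # 1 s of input at dt = 1 ms
spikes = simulate_lif(current)
```

With a constant input above threshold, the neuron charges up, fires, resets, and repeats, producing a regular spike train; the networks studied here go further and reproduce *prescribed* spike trains despite noise.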
- Optionally, create and activate a virtual environment.

      python -m venv rsnn
      source rsnn/bin/activate

  or

      conda create -n rsnn
      conda activate rsnn
- Clone this repository.

      git clone https://github.com/haguettaz/RSNN.git
- Install the RSNN package and its dependencies.

      python -m pip install -e RSNN
A tutorial on getting started with the package is available here. Click the Binder badge to play with the notebook in your browser without installing anything.
- H. Aguettaz and H.-A. Loeliger, "Continuous-time neural networks can stably memorize random spike trains," arXiv:2408.01166, Aug. 2024.
- P. Murer, A New Perspective on Memorization in Recurrent Networks of Spiking Neurons. Ph.D. dissertation, No. 28166, ETH Zürich, 2022.
- P. Murer and H.-A. Loeliger, "Online memorization of random firing sequences by a recurrent neural network," 2020 IEEE International Symposium on Information Theory (ISIT), June 21-26, 2020.