Generative working memory in Transformer decoder
Updated Jun 26, 2024 · Python
This repository contains all the code needed to reproduce the figures and results reported in Stein, Barbosa et al. (Nature Communications, 2020), from the raw data acquired in human behavioral experiments (included in the repository) and from the relevant model simulations.
A unified working & short-term memory task for artificial neural networks.
A visual working memory game in Python that records gameplay data
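The description does not show the game's actual mechanics or data format, but a hypothetical sketch of what per-trial gameplay data for a visual working memory game might look like (the `run_trial` helper and its fields are invented for illustration) is:

```python
import random

def run_trial(rng, n_items=3, colors=("red", "green", "blue", "yellow")):
    """One delayed-recall trial: present colored items, probe one location,
    and log the outcome. Hypothetical sketch -- the repository's actual
    game mechanics and data format are not described."""
    sample = [rng.choice(colors) for _ in range(n_items)]  # memoranda shown to the player
    probe_index = rng.randrange(n_items)                   # location queried after the delay
    response = rng.choice(colors)                          # stand-in for the player's answer
    return {
        "sample": sample,
        "probe_index": probe_index,
        "response": response,
        "correct": response == sample[probe_index],
    }

rng = random.Random(0)
gameplay_data = [run_trial(rng) for _ in range(5)]  # one record per trial
```

Logging one dictionary per trial keeps the data easy to dump to JSON or CSV for later analysis of accuracy by set size or probe position.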
A setup to evaluate working memory using EEG signals
Spiking neuronal network simulations (Python, NEST Simulator) for continuous attractor working memory networks with short-term plasticity.
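The spiking NEST model itself is not reproduced here, but the continuous-attractor idea it builds on can be sketched with a much simpler firing-rate model in NumPy. All parameter values below are illustrative assumptions, not taken from the repository:

```python
import numpy as np

def ring_attractor(n=100, cue=25, steps=400, dt=0.1, tau=1.0):
    """Simplified rate-model sketch of a continuous (ring) attractor that
    holds a cued location in working memory after the stimulus is removed.
    Illustrative only -- the repository uses spiking networks in NEST with
    short-term plasticity; all parameters here are made up."""
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    # Translation-invariant connectivity: local excitation, global inhibition
    W = 8.0 * np.cos(theta[:, None] - theta[None, :]) - 2.0
    r = np.zeros(n)
    idx = np.arange(n)
    for t in range(steps):
        # Transient Gaussian cue at one location, switched off after 100 steps
        stim = 2.0 * np.exp(-((idx - cue) ** 2) / 50.0) if t < 100 else 0.0
        drive = W @ r / n + stim
        r += dt / tau * (-r + np.clip(drive, 0.0, 1.0))  # saturating rate dynamics
    return r

rates = ring_attractor()
# After stimulus offset, a self-sustained activity bump remains near the cue.
```

Because the connectivity depends only on the angular distance between units, the network has a continuum of bump states, so it can store any cued angle; recurrent excitation keeps the bump alive after the cue ends, which is the working-memory mechanism.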
Personalized training for the sequence-learning task with the NAO robot and the MUSE EEG sensor