
Code Repository for Flesch, Nagy et al: "Modelling continual learning in humans with Hebbian context gating and exponentially decaying task signals"


This repository is a work in progress; stay tuned!

Usage

To replicate results reported in the paper, clone this repository and install the required packages (preferably in a separate environment):

pip install -r requirements.txt

To re-run all simulations and collect several independent training runs, open a terminal and run the following bash script:

./runner.sh

For individual runs, call main.py with command-line arguments.
If you want to run your own hyperparameter optimisation, have a look at the HPOTuner class in hebbcl.tuner.
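For example, a single run might be launched as follows. Note that the flag names below are purely illustrative (this README does not document the actual arguments); run `python main.py --help` to see the options the script really accepts:

```shell
# Hypothetical invocation -- the flags shown here are placeholders,
# not the repo's documented CLI. Consult `python main.py --help`.
python main.py --seed 0 --save-dir checkpoints/run0
```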

To replicate the analyses and create the figures, see the paper_figures_scratchpad.ipynb notebook in the notebooks subfolder.

Preprint

A preprint of this work is available at https://arxiv.org/abs/2203.11560

Citation

If you'd like to cite this work, please use the following format:

@article{FleschNagyEtal2022,
  doi = {10.48550/ARXIV.2203.11560},
  url = {https://arxiv.org/abs/2203.11560},
  author = {Flesch, Timo and Nagy, David G. and Saxe, Andrew and Summerfield, Christopher},
  keywords = {Neurons and Cognition (q-bio.NC), Machine Learning (cs.LG), FOS: Biological sciences, FOS: Computer and information sciences},
  title = {Modelling continual learning in humans with Hebbian context gating and exponentially decaying task signals},
  publisher = {arXiv},
  year = {2022},
  month = {3},
  arxivId = {2203.11560},
  copyright = {Creative Commons Attribution 4.0 International}
}
