PyTorch implementation of "Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers" (NeurIPS 2022).
Use synthetic_data_threebody.py or synthetic_data_twobody.py to generate the synthetic data, then run the corresponding synthetic_exp_EXP.py script to reproduce the experiments. The models themselves are defined in synthetic_exp_twobody.py.
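For intuition only, here is a toy sketch of what "two-body" synthetic dynamics could look like: two point masses under mutual gravity, integrated with a simple Euler scheme. This is a hypothetical illustration; synthetic_data_twobody.py defines its own data-generating process, and none of the names or parameters below come from the repo.

```python
import numpy as np

def simulate_two_body(n_steps=500, dt=0.01, g=1.0, m1=1.0, m2=1.0):
    # Toy Euler integration of two point masses under mutual gravity.
    # Illustrative only: the repo's synthetic_data_twobody.py defines
    # its own (possibly different) dynamics and output format.
    pos = np.array([[-0.5, 0.0], [0.5, 0.0]])   # initial positions
    vel = np.array([[0.0, -0.5], [0.0, 0.5]])   # initial velocities
    traj = np.empty((n_steps, 2, 2))
    for t in range(n_steps):
        r = pos[1] - pos[0]
        dist3 = np.linalg.norm(r) ** 3 + 1e-9   # softened to avoid /0
        f = g * m1 * m2 * r / dist3             # force on body 1
        vel[0] += dt * f / m1
        vel[1] -= dt * f / m2
        pos = pos + dt * vel
        traj[t] = pos
    return traj  # shape (n_steps, n_bodies, n_dims)

traj = simulate_two_body()
print(traj.shape)  # (500, 2, 2)
```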
The modified transformer building blocks are in the my_transformers folder.
The datasets, models, and trainers are in the neural_kits folder.
Experiments are launched from main.py, where the flag MAEorVIT selects the experiment type. This codebase provides:
- Training EIT,
- Training the individual module of EIT,
- Transferring a pre-trained EIT across different animals,
- Transferring a pre-trained EIT across different targets,
- Transferring a pre-trained EIT across different timepoints,
- Training a vanilla transformer, either supervised or self-supervised, as a benchmark.
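As a rough illustration of how such a flag might be consumed (this is not the repo's actual code; only the flag name MAEorVIT comes from this README, and the choices are assumptions), an argparse sketch could look like:

```python
import argparse

def build_parser():
    # Hypothetical sketch: main.py defines its own arguments;
    # only the flag name MAEorVIT is taken from the README text.
    parser = argparse.ArgumentParser(description="EIT experiments (sketch)")
    parser.add_argument(
        "--MAEorVIT",
        type=str,
        default="EIT",
        choices=["EIT", "EIT-module", "transfer", "benchmark"],  # assumed values
        help="Selects which experiment to run (assumed option set).",
    )
    return parser

args = build_parser().parse_args(["--MAEorVIT", "benchmark"])
print(args.MAEorVIT)  # -> benchmark
```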
Where to find the training datasets: for the Mihi-Chewie datasets, you can check out this. For the Jenkins' Maze datasets, you can check out this.
- Ran Liu (Maintainer), GitHub: ranliu98
- Jingyun Xiao, GitHub: jingyunx
- Mehdi Azabou, GitHub: mazabou
If you find the code useful for your research, please consider citing our work:
@inproceedings{liu2022seeing,
title={Seeing the forest and the tree: Building representations of both individual and collective dynamics with transformers},
author={Liu, Ran and Azabou, Mehdi and Dabagia, Max and Xiao, Jingyun and Dyer, Eva L},
booktitle={Advances in Neural Information Processing Systems},
year={2022}
}
and our earlier work, Swap-VAE:
@article{liu2021drop,
title={Drop, Swap, and Generate: A Self-Supervised Approach for Generating Neural Activity},
author={Liu, Ran and Azabou, Mehdi and Dabagia, Max and Lin, Chi-Heng and Gheshlaghi Azar, Mohammad and Hengen, Keith and Valko, Michal and Dyer, Eva},
journal={Advances in Neural Information Processing Systems},
volume={34},
pages={10587--10599},
year={2021}
}