This repository provides code for computing graph metrics from spatial attention patterns across different model families and brain functional networks.
- Vision Transformers: `vit-process.py`, `dinov3-process.py`
- Language models: `gpt2-process.py`, `bert-process.py`
- Vision-language models: `clip-language.py`, `clip-vision.py`, `blip-language.py`, `blip-vision.py`
- LLMs with Rotary Position Embeddings: `qwen-process.py`
- LMMs with Rotary Position Embeddings: `qwen-language-process.py`, `qwen-vision-process.py`, `deepseek-language-process.py`
Before running any model processing scripts, modify these key parameters:
- Model List: Specify the target list of model names in `models`
- Cache Directory: Set `your_cache_dir` for model downloads
- Output Directory: Set `your_base_dir` for results storage
- Access Tokens: For gated models, add your HuggingFace access token in the download section (request access from the official model pages)
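These parameters might sit at the top of each processing script roughly as follows. This is a sketch: the names `models`, `your_cache_dir`, and `your_base_dir` follow the README, while `hf_token` is a hypothetical name and all values are placeholders.

```python
# Placeholder values; adjust for your environment.
models = ["gpt2", "bert-base-uncased"]   # target list of model names (examples)
your_cache_dir = "/path/to/hf_cache"     # cache directory for model downloads
your_base_dir = "/path/to/results"       # output directory for results storage
hf_token = None                          # HuggingFace access token for gated models
```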
For other or new RoPE-based LLM/LMM models, adapt the existing examples by:
- Modifying the model loader
- Adjusting configuration extraction based on model architecture
- Updating Q/K weight extraction methods according to model nesting structure
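To illustrate the last step, here is a sketch of Q/K weight extraction assuming a LLaMA/Qwen-style (RoPE) nesting, using a tiny randomly initialized model in place of a real checkpoint; other families nest their attention modules differently, so the attribute paths below are an assumption to adapt.

```python
import torch
from transformers import LlamaConfig, LlamaModel

# Tiny randomly initialized LLaMA-style (RoPE) model standing in for a real
# checkpoint; for real models use AutoModel.from_pretrained(model_name).
config = LlamaConfig(hidden_size=64, intermediate_size=128,
                     num_hidden_layers=2, num_attention_heads=4,
                     num_key_value_heads=4, vocab_size=128)
model = LlamaModel(config)

# The nesting model.layers[i].self_attn.{q_proj, k_proj} is LLaMA/Qwen-specific;
# adjust these attribute paths for other architectures.
qk_weights = []
for layer in model.layers:
    attn = layer.self_attn
    qk_weights.append((attn.q_proj.weight.detach(),
                       attn.k_proj.weight.detach()))

print(len(qk_weights), tuple(qk_weights[0][0].shape))
```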
Computes graph metrics for brain functional networks.
- Prepare multiple pre-extracted brain network connectivity matrices (`.mat` files)
- Ensure main diagonal elements are set to zero in all matrices
- `model-result.zip`: Graph metrics for the 151 models used in the paper; please extract to `model result/`
- `brainnetwork-result.json`: Graph metrics for 7 group-level brain functional networks
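A minimal sketch of unpacking and loading these result files follows. Tiny stand-in files are created first so the snippet runs without the real downloads, and the JSON keys inside them are hypothetical; with the provided files, only the extract and load steps apply.

```python
import json
import os
import zipfile

# Create tiny stand-in files so the example is self-contained; in real use,
# the provided model-result.zip and brainnetwork-result.json are used instead.
with zipfile.ZipFile("model-result.zip", "w") as zf:
    zf.writestr("model result/gpt2.json", json.dumps({"clustering": 0.1}))
with open("brainnetwork-result.json", "w") as f:
    json.dump({"network_1": {"clustering": 0.2}}, f)  # hypothetical keys

# Extract the per-model metrics to 'model result/' as the README requests.
with zipfile.ZipFile("model-result.zip") as zf:
    zf.extractall(".")

# Load the group-level brain network metrics.
with open("brainnetwork-result.json") as f:
    brain_metrics = json.load(f)

print(os.listdir("model result"), list(brain_metrics))
```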
Constructs the brain-like space and computes brain-likeness scores.
Use the provided `model-result.zip` and `brainnetwork-result.json` files to:
- Visualize the brain-like space
- Generate brain-like score result files
- torch==2.8.0
- transformers==4.56.1
- accelerate==1.10.1
- safetensors==0.6.2
- timm==1.0.14
- networkx==3.4.2
- python-louvain==0.16
- numpy==2.2.6
- scipy==1.15.3
Note: transformers and accelerate versions may need adjustment for specific models. Some models require different versions for compatibility.
If you use this code in your research, please cite our accompanying paper: *Situating and Comparing Artificial Neural Networks in a Unified Brain-like Space*.