
Predicting and Interpreting the single neuron response of the visual cortex via Deep Learning Model

This is the package of our winning solution in the SENSORIUM 2022 Challenge. Details of the challenge can be found in this paper. Please contact us (dengkw@umich.edu or gyuanfan@umich.edu) with any questions or suggestions.

Figure 1: Overview of the data and experiment design

Figure 2: Overview of the methods


Installation

Clone a copy of the code:

git clone https://github.com/GuanLab/Sensorium2022_Challenge.git

Set up the running environment with conda/mamba

We provide the commands line-by-line in setup_environment.sh for creating the running environment
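The exact commands live in setup_environment.sh; a typical conda flow might look like the sketch below (the environment name and package list here are illustrative assumptions, not the script's actual contents):

```shell
# Create and activate a dedicated environment (name is illustrative)
conda create -n sensorium python=3.9 -y
conda activate sensorium

# Install core dependencies (assumed packages; see setup_environment.sh for the real list)
pip install torch torchvision numpy pandas scikit-learn jupyter
```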

Install YOLOv5

# you may need to ensure the directory name is "yolov5"
git clone https://github.com/ultralytics/yolov5.git

Download the data into the dataset directory: https://gin.g-node.org/cajal/Sensorium2022

The pretrained weights can be retrieved from Google Drive. Save them under the sensorium/model_checkpoints folder.

Build the model on the challenge data

Data pre-processing

After downloading and unzipping the challenge data, follow the scripts in 0_process_data.ipynb to label the bounding boxes and generate different train-validation splits for the ensemble. yolov5l.pt is the official pretrained weights downloaded here, and yolo-finetune.pt is our fine-tuned weights on ILSVRC2017.
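The split logic itself lives in the notebook, but the idea of generating different train-validation splits for an ensemble can be sketched as follows (the function name, split ratio, and number of models are illustrative assumptions, not the notebook's exact settings):

```python
import numpy as np

def make_ensemble_splits(n_samples, n_models, val_frac=0.2, seed=0):
    """Generate a different shuffled train/validation split for each
    ensemble member by varying the random seed."""
    splits = []
    n_val = int(n_samples * val_frac)
    for i in range(n_models):
        rng = np.random.default_rng(seed + i)
        idx = rng.permutation(n_samples)
        val_idx, train_idx = idx[:n_val], idx[n_val:]
        splits.append((train_idx, val_idx))
    return splits

# e.g. five members, each seeing a different 80/20 split
splits = make_ensemble_splits(n_samples=1000, n_models=5)
```

Varying only the seed keeps the members comparable while decorrelating their validation sets, which is what makes averaging them worthwhile later.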

Train and evaluate model

Following the scripts in sensorium/1_train_evaluate_submit.ipynb, you can train and evaluate the model on the challenge data and reproduce the performance reported in our paper.

  1. Training

    # You may want to change visible gpus in this script.
    bash run.sh
  2. Predict and evaluate

    CUDA_VISIBLE_DEVICES=0 python predict.py
    
    # get the performance for each neuron from N models
    # for example: ensemble from 5 models
    CUDA_VISIBLE_DEVICES=0 python predict_per_neuron.py 5 
  3. Generate the predictions and corresponding responses (the ground truths) for analysis

    CUDA_VISIBLE_DEVICES=0 python submit.py
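The per-neuron evaluation in predict_per_neuron.py boils down to averaging the predictions of the N models and correlating the ensemble mean with the recorded responses, neuron by neuron. A minimal sketch (the function name, array shapes, and use of Pearson correlation are assumptions for illustration):

```python
import numpy as np

def ensemble_per_neuron_corr(predictions, responses):
    """predictions: (n_models, n_trials, n_neurons) stacked model outputs.
    responses: (n_trials, n_neurons) recorded neural responses.
    Returns the Pearson correlation per neuron for the ensemble mean."""
    mean_pred = predictions.mean(axis=0)  # average across ensemble members
    n_neurons = responses.shape[1]
    corrs = np.empty(n_neurons)
    for n in range(n_neurons):
        corrs[n] = np.corrcoef(mean_pred[:, n], responses[:, n])[0, 1]
    return corrs

# toy example: 5 models, 100 trials, 8 neurons
rng = np.random.default_rng(0)
resp = rng.random((100, 8))
preds = resp[None] + 0.1 * rng.standard_normal((5, 100, 8))
corrs = ensemble_per_neuron_corr(preds, resp)
```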

Analyze the predictions (optional)

We provide the scripts in analyze to reproduce our results and some of the figures in the paper. They include extracting the image properties (complexity, brightness, contrast) in inspect_model_with_image.ipynb, analyzing the spatial properties in grid_experiment.ipynb, estimating the artificial receptive fields (aRFs) in estimate_aRF.py and plotting them in plot_aRF.ipynb, and visualizing the retinotopic maps in retinotopic_map.ipynb.
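The exact property definitions live in inspect_model_with_image.ipynb; common choices, which we use here as illustrative assumptions rather than the notebook's actual formulas, are mean intensity for brightness, intensity standard deviation for contrast, and compressed byte length as a cheap proxy for complexity:

```python
import zlib
import numpy as np

def image_properties(img):
    """img: 2-D uint8 grayscale array.
    Returns (brightness, contrast, complexity), where complexity is the
    zlib-compressed byte length of the pixel buffer."""
    brightness = float(img.mean())
    contrast = float(img.std())
    complexity = len(zlib.compress(img.tobytes()))
    return brightness, contrast, complexity

# a flat image compresses well; a noisy one does not
flat = np.full((64, 64), 128, dtype=np.uint8)
noisy = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
```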

Reference

https://github.com/sinzlab/neuralpredictors

https://github.com/sinzlab/sensorium

https://github.com/bryanlimy/V1T
