Neuromatch Academy was a summer school held in 2020. In the midst of a pandemic, the school emerged on an online platform, reaching students across the world. The academy aimed to train neuroscientists in computational tools, to connect those tools to real-world neuroscience problems, and to promote networking with researchers.
The academy also gave students the opportunity to carry out their own three-week mini-project, with several datasets provided for exploration. Our group, Decision Makers, from the pod 041-Gregarious-sambars, decided to work with the Steinmetz dataset. Below is a description of the dataset we chose.
During a decision-making task like the one in the Steinmetz experiment, a number of different brain regions are involved along the pathway from sensory processing through decision making to motor action. These regions act together in particular networks.
Can we predict the rodent’s movement based on activity from different brain regions before or after the movement has occurred?
Our questions were inspired by the original questions posed in some of the exemplary projects.
We decided to focus on one rodent's data from a single session (session no. 11). Our chosen regions were the visual cortex, the thalamus, and the motor areas (MOs, MOp). These regions are anatomically and functionally distinct, so they were our best shot at investigating differences in activity.
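In the course release of the Steinmetz dataset, each session is a dictionary with one `brain_area` label per neuron and a `spks` array of shape (neurons, trials, time bins); the exact region labels and the toy session below are our own illustrative assumptions. A minimal sketch of selecting neurons from regions like ours:

```python
import numpy as np

# Hypothetical session dictionary mimicking the NMA Steinmetz format:
# 'spks' has shape (n_neurons, n_trials, n_time_bins),
# 'brain_area' gives one region label per neuron.
rng = np.random.default_rng(0)
session = {
    "spks": rng.poisson(0.5, size=(8, 4, 250)),
    "brain_area": np.array(["VISp", "MOs", "TH", "VISp", "MOp", "CA1", "TH", "MOs"]),
}

# Regions of interest: visual cortex, thalamus, motor areas
regions_of_interest = ["VISp", "TH", "MOs", "MOp"]

# Boolean mask over neurons, then slice the spike array
mask = np.isin(session["brain_area"], regions_of_interest)
roi_spks = session["spks"][mask]

print(mask.sum())      # number of selected neurons
print(roi_spks.shape)  # (selected neurons, trials, time bins)
```

The same mask can then be reused to pull out the matching region labels or any other per-neuron field of the session.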
We believe the dataset has considerable potential: given more time, we could build richer models of the temporal activity of these regions. For the time being, however, we settled on the simpler task.
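As a toy illustration of the decoding question above (predicting the movement from regional activity), one can fit a linear classifier on per-trial spike counts. Everything below, including the synthetic data, the firing-rate bump, and the choice of logistic regression, is a sketch of the general approach, not the pipeline we actually ran:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for the real data: 100 trials, 20 neurons,
# spike counts in a pre-movement window. One class gets a small
# firing-rate bump so the decoder has signal to find.
n_trials, n_neurons = 100, 20
y = rng.integers(0, 2, size=n_trials)             # wheel turn: 0 = left, 1 = right
X = rng.poisson(5.0, size=(n_trials, n_neurons)).astype(float)
X[y == 1] += 2.0                                  # movement-related modulation

# Cross-validated accuracy of a linear decoder
clf = LogisticRegression(max_iter=1000)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean() > 0.5)                        # well above chance here
```

Comparing the same decoder on windows before versus after movement onset is one simple way to frame the "before or after" part of the question.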
We have successfully completed the project. All the relevant details can be found in our supporting documentation file, which can be accessed here.
The dataset used for this purpose was collected by Steinmetz et al. (2019).
The Steinmetz dataset consists of electrophysiological recordings from multiple regions of the mouse brain during a two-alternative forced choice (2-AFC) task. Neuropixels probes were used to record from approximately 30,000 neurons across 42 regions while the mice performed a visual discrimination task. In each trial (multiple trials were conducted per session, over a total of 39 sessions), a head-fixed mouse was placed on a wheel, surrounded by three screens (left, right, and front). Images of differing contrast were presented on the left screen, the right screen, or both, and the mouse had to turn the wheel in the correct direction to bring the higher-contrast image to the front screen. If no image was presented on either side, the correct response was to hold the wheel steady for 1.5 s. Neural activity was recorded continuously for the entire duration of the task.
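The response rule described above fits in a few lines of code. The function name and the string labels are our own illustrative choices, not identifiers from the original analysis code:

```python
def correct_response(contrast_left, contrast_right):
    """Correct wheel action for one trial of the 2-AFC task.

    Returns the side of the higher-contrast stimulus (the image that
    must be brought to the front screen), or 'hold' when no stimulus
    is shown. Equal nonzero contrasts were rewarded randomly in the
    experiment, so we mark them 'either'.
    """
    if contrast_left == 0 and contrast_right == 0:
        return "hold"    # keep the wheel still for 1.5 s
    if contrast_left == contrast_right:
        return "either"  # random reward in the actual task
    return "left" if contrast_left > contrast_right else "right"

print(correct_response(0.5, 0.0))   # stimulus only on the left -> 'left'
print(correct_response(0.0, 0.0))   # no stimulus -> 'hold'
```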
Code for the analysis was written in Python, with the help of the scientific packages NumPy, SciPy, scikit-learn, Neural_Decoding, and Matplotlib.
To load the data into our notebooks for further analysis, we used some code provided by Dr. Marius Pachitariu.
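The loading step boils down to reading pickled NumPy archives that store an object array of session dictionaries under the key `dat`. The snippet below mimics that pattern on a tiny synthetic file; the field names and mouse names are illustrative assumptions based on the course materials, and the real archive filenames differ:

```python
import numpy as np

# Create a tiny stand-in for one of the dataset archives: an object
# array of session dictionaries saved under the key 'dat'.
sessions = np.empty(2, dtype=object)
sessions[0] = {"mouse_name": "Cori", "spks": np.zeros((3, 4, 250))}
sessions[1] = {"mouse_name": "Lederberg", "spks": np.zeros((5, 4, 250))}
np.savez_compressed("demo_part0.npz", dat=sessions)

# Loading pattern: allow_pickle=True is required for object arrays
alldat = np.load("demo_part0.npz", allow_pickle=True)["dat"]
one_session = alldat[1]   # individual sessions are indexed like this
print(len(alldat), one_session["mouse_name"])
```

With the real files, the per-part arrays are concatenated into one array of sessions before indexing the session of interest.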
For a detailed description of the dataset see this document by Dr. Nick Steinmetz and this writeup by us.
Literature:
- Distributed coding of choice, action and engagement across the mouse brain
- A perceptual decision requires sensory but not action coding in mouse cortex
- Theoretical Neuroscience
- Perceptual Decision Making in Rodents, Monkeys, and Humans