Project for Neuromatch Academy: Computational Neuroscience

Does engagement matter: Do mice see the world differently when they don't care?

This repository contains the code for our project at Neuromatch Academy: Computational Neuroscience.

Traditionally, V1 (VISp) is considered a simple feature detector. In contrast, stimulus representation in A1 is known to adapt to engagement in a go/no-go task. Does stimulus representation in V1 also differ between engagement (active) and disengagement (passive)? We investigate this question at both the single-cell and the population level.

Requirements

Required Python packages are listed in the requirements.txt file. The dataset can be accessed through the AllenSDK API.

Results

  • At the single-cell level, our methods yield no clear results.
  • Population-level dissimilarity (RDM) hints at a difference between active and passive sessions, but with the control (pre-stimulus window) this difference appears to be independent of the stimuli.

Refer to the respective notebooks for the single-cell and population-level analyses. For detailed results, see the presentation file.
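As a rough illustration of the population-level approach, a representational dissimilarity matrix (RDM) can be computed as the pairwise correlation distance between population response vectors, and two RDMs compared by correlating their upper triangles. This is a minimal sketch with toy data; the variable names and shapes are illustrative and not taken from the notebooks.

```python
import numpy as np

def rdm(responses):
    """Representational dissimilarity matrix: 1 - Pearson correlation
    between population response vectors (one row per stimulus)."""
    return 1.0 - np.corrcoef(responses)

# Toy data standing in for trial-averaged responses:
# 8 stimuli x 50 neurons, for an active and a passive session.
rng = np.random.default_rng(0)
active = rng.normal(size=(8, 50))
passive = rng.normal(size=(8, 50))

rdm_active = rdm(active)
rdm_passive = rdm(passive)

# Compare the two RDMs via the correlation of their upper triangles
# (the diagonal is zero by construction, so it is excluded).
iu = np.triu_indices(8, k=1)
similarity = np.corrcoef(rdm_active[iu], rdm_passive[iu])[0, 1]
```

The same comparison against an RDM computed on a pre-stimulus window serves as the control mentioned above.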
