
Laminar thickness covariance in BigBrain

This repository includes the data and code associated with the paper "The regional variation of laminar thickness in the human isocortex is related to cortical hierarchy and inter-regional connectivity" (Saberi et al., 2023, PLOS Biology).

Docker

All the code, data, and dependencies of this project (except the dependencies of code/local) are available as a Docker image.

To run the Docker image, follow these steps:

  1. Install Docker on your computer (if not already installed).
  2. Pull the amnsbr/laminar_organization image from Docker Hub: docker pull amnsbr/laminar_organization
  3. Run the container: docker run -it -p 8888:8888 --privileged amnsbr/laminar_organization. This starts a Jupyter Notebook instance in which the analyses can be run. Note that Jupyter uses port 8888, so this port must be free on the host.

Note: The --privileged flag is needed for functions that require BigBrainWarp (e.g. human-to-macaque surface transformation). Please be aware that these functions may not work properly on Macs with Apple Silicon (M-series) processors.
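
As a quick sanity check before step 3, the following minimal Python sketch (not part of this repository) verifies that port 8888 is free on the host:

    import socket

    # Try connecting to port 8888 on localhost; a successful connection
    # (connect_ex returning 0) means something is already listening there.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        port_in_use = s.connect_ex(("127.0.0.1", 8888)) == 0

    print("Port 8888 is", "in use" if port_in_use else "free")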

Repository structure

  • Source data is located in src/. The README in that folder gives a short description and the origin of each source file.
  • The analyses are run via the files in code/:
    • setup.sh creates the virtual environment and installs the Python dependencies.
    • run_figures.sh executes the Jupyter notebooks in code/figures, which output the figures reported in the main text as well as in the supplements. Note that many of these notebooks run computationally intensive code, so it is recommended to execute them on a high-performance computing cluster. The figure notebooks can of course also be run individually.
    • The notebooks mainly act as interfaces and use the functions and classes defined in the Python files (a hedged usage sketch is given after this list), which include:
      • datasets.py: Functions for loading and preprocessing the source data from src/ or from external packages.
      • matrices.py: Includes a generic class Matrix with functions for matrix plotting and matrix-level associations, as well as a class for each type of matrix used in this study, such as MicrostructuralCovarianceMatrix (which can create LTC, LDC, MPC and fused LTC-LDC matrices), ConnectivityMatrix (which loads structural, functional and effective connectivity matrices) and StructuralCovarianceMatrix.
      • surfaces.py: Includes the generic class CorticalSurface and its children ContCorticalSurface (continuous data) and CatCorticalSurface (categorical data), which provide functions for surface plotting and surface-level associations. It also includes classes for each specific type of surface data used in this study, such as Gradients and MicrostructuralCovarianceGradients (which create the gradients), EffectiveConnectivityMaps (which creates the asymmetry-based hierarchy map) and MacaqueHierarchy (which loads the macaque laminar-based hierarchy map).
      • helpers.py: Includes helper functions for data manipulation (parcellation/deparcellation, down-/upsampling), statistical analysis (spin and variogram permutations), plotting surfaces/matrices, and transformations (e.g., human-to-macaque mapping).
    • local/ contains code that cannot easily be included in the pipeline (run_figures.sh) because of its dependencies (e.g., MATLAB, FreeSurfer, or Docker/Singularity containers). However, the outputs of these scripts are included in src/ and used by the main scripts.
  • The output figures and statistics generated by the code are mainly displayed in the Jupyter notebooks, but parts of the code also save output to the output/ subfolder (created by the scripts), which contains a subfolder for each type of data.
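
To illustrate how these modules fit together, here is a hedged sketch of the intended workflow. The class names are taken from the descriptions above, but the constructor arguments and method names are assumptions; check the docstrings in code/ for the actual signatures.

    import matrices
    import surfaces

    # Create a laminar thickness covariance (LTC) matrix; the 'thickness'
    # argument is a guess at how the matrix type is selected.
    ltc = matrices.MicrostructuralCovarianceMatrix('thickness')

    # Derive gradients from the matrix and plot them on the surface; 'plot'
    # is an assumed name for the surface-plotting functions mentioned above.
    gradients = surfaces.MicrostructuralCovarianceGradients(ltc)
    gradients.plot()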

External dependencies

The Python dependencies are installed by setup.sh; in addition, the main scripts have the following dependencies, which need to be installed manually (a quick check sketch is given at the end of this section):

  • Conda: the version used here was 4.12.0
  • Connectome Workbench: the version used here was 1.5
  • BigBrainWarp as a Singularity image, needed for the function surface_to_surface_transformation in helpers.py, which is used to transform LTC G1 from BigBrain space to macaque space. The image can be created by running: singularity build bigbrainwarp.simg docker://caseypaquola/bigbrainwarp:latest

In addition, each of the scripts in code/local has its own specific dependencies, including CIVET 2.1.1 as a Singularity image (for creating density profiles), FreeSurfer 7.1 (for inflating the BigBrain surface), and MATLAB R2021b (for downsampling BigBrain and projecting fsaverage annot files to MNI space).
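
Before running the main scripts, a quick way to confirm that the manually installed tools are reachable is a small check like the following (a hedged sketch, not part of this repository; wb_command is the Connectome Workbench CLI, and the bigbrainwarp.simg path is an assumption):

    import os
    import shutil

    # Check that the manually installed command-line dependencies are on PATH.
    for tool in ("conda", "wb_command", "singularity"):
        print(f"{tool}: {'found' if shutil.which(tool) else 'MISSING'}")

    # Adjust this path to wherever bigbrainwarp.simg was built.
    print("bigbrainwarp.simg:", "found" if os.path.exists("bigbrainwarp.simg") else "MISSING")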

Support

Feel free to contact me (amnsbr[at]gmail.com, a.saberi[at]fz-juelich.de) or open an issue if you have any questions or encounter any problems with the code.
