4 changes: 4 additions & 0 deletions .gitignore
@@ -5,3 +5,7 @@
*.nii

opt.mat

# files in the demo folder related to running the demo analysis
demo/*.zip
demo/output/*
52 changes: 36 additions & 16 deletions README.md
@@ -5,25 +5,30 @@

<!-- TOC -->

- [Instructions for SPM12 Preprocessing Pipeline](#instructions-for-spm12-preprocessing-pipeline)
- [Dependencies](#dependencies)
- [General description](#general-description)
- [Assumption](#assumption)
- [Setting up](#setting-up)
- [getOptions](#getoptions)
- [model JSON files](#model-json-files)
- [Order of the analysis](#order-of-the-analysis)
- [Docker](#docker)
- [build docker image](#build-docker-image)
- [run docker image](#run-docker-image)
- [MRIQC](#mriqc)
- [Details about some steps](#details-about-some-steps)
- [Slice timing correction](#slice-timing-correction)
- [Boiler plate methods section](#boiler-plate-methods-section)
- [Preprocessing](#preprocessing)
- [fMRI data analysis](#fmri-data-analysis)
- [References](#references)
- [Testing](#testing)
- [Unit testing](#unit-testing)
- [Contributors ✨](#contributors-)

<!-- /TOC -->


## Dependencies

Make sure that the following toolboxes are installed and added to the MATLAB path.
@@ -38,6 +43,7 @@ For instructions see the following links:

For simplicity the NIfTI tools toolbox has been added to this repo in the `subfun` folder.


## General description

This set of functions reads and unzips the data from a [BIDS data set](https://bids.neuroimaging.io/). It will then perform:
@@ -53,19 +59,21 @@ It can also prepare the data to run an MVPA analysis by running a GLM for each s

The core functions are in the sub-function folder `subfun`


## Assumption

At the moment this pipeline makes some assumptions:
- it assumes that the dummy scans have been removed from the BIDS data set so that it can jump straight into pre-processing,
- it assumes that the metadata for a given task are the same as those of the first run of the first subject this pipeline is run on,
- it assumes that groups are defined in the subject field (e.g. `sub-ctrl01`, `sub-blind01`, ...) and not in the `participants.tsv` file.


## Setting up

### getOptions


All the details specific to your analysis should be set in `getOptions.m`. The `getOption_template` file shows how you would set up the getOption file to analyse the [ds001 data set from OpenNeuro](https://openneuro.org/datasets/ds000001/versions/57fecb0ccce88d000ac17538).
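For instance, a minimal `getOption.m` could look like the sketch below. The field names come from the `getOption_template` shown further down this page; the task name and paths are placeholders you must adapt to your own data set.

```matlab
% Minimal sketch of a getOption.m (field names from getOption_template;
% the task name and paths are placeholder assumptions).
function opt = getOption()

    % group of subjects and subjects to run in each group
    opt.groups = {''};
    opt.subjects = {[1:2]};

    % task to analyze (must match the task label in your BIDS data set)
    opt.taskName = 'auditory';

    % directory where the raw BIDS data set is located
    opt.dataDir = '/home/user/data/raw';

    % model JSON file that contains the contrasts to compute
    opt.model.univariate.file = fullfile(pwd, 'model-MoAE_smdl.json');

    % save the options so the preprocessing scripts can load them
    save('opt.mat', 'opt');

end
```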

Set the group of subjects to analyze.
```
@@ -102,6 +110,7 @@ The directory where your files are located on your computer: make sure you have

Some more SPM options can be set in the `spm_my_defaults.m`.


### model JSON files
These files allow you to specify which contrasts to run. They follow the BIDS statistical model extension as implemented by [fitlins](https://fitlins.readthedocs.io/en/latest/model.html).

@@ -153,47 +162,51 @@ In brief this means:
- at the subject level, automatically compute the t contrasts against baseline for the conditions `motion` and `static`, and compute the t contrast for motion VS static with the given weights.
- at the level of the data set (so RFX), compute the t contrasts for `motion`, `static`, and `motion VS static`.

We are currently using this to run different subject-level GLM models for our univariate and multivariate analyses: in the first one we compute a con image that averages the beta images of all the runs, whereas in the latter case we get one con image for each run.
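As an illustration only, a model file for the `motion` / `static` example above might look like the sketch below. The condition names are hypothetical, and the exact contrast keys (`ConditionList`, `weights`, `type`) should be checked against the fitlins model specification linked above and the JSON files in the `model` folder of this repository.

```json
{
  "Name": "motionExample",
  "Description": "hypothetical contrasts for a motion localizer",
  "Input": {
    "task": "visMotion"
  },
  "Steps": [
    {
      "Level": "subject",
      "AutoContrasts": ["trial_type.motion", "trial_type.static"],
      "Contrasts": [
        {
          "Name": "motion_gt_static",
          "ConditionList": ["trial_type.motion", "trial_type.static"],
          "weights": [1, -1],
          "type": "t"
        }
      ]
    },
    {
      "Level": "dataset",
      "AutoContrasts": ["trial_type.motion", "trial_type.static", "motion_gt_static"]
    }
  ]
}
```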


## Order of the analysis

1. __Remove Dummy Scans__:
Unzips bold files and removes dummy scans by running the script (to be run even if `opt.numDummies` is set to `0`): `BIDS_rmDummies.m`

2. __Slice Time Correction__: Performs Slice Time Correction (STC) of the functional volumes by running the script: `BIDS_STC.m`

STC will be performed using the information provided in the BIDS data set. It will use the mid-volume acquisition time point as reference.

The `getOption.m` fields related to STC can still be used to perform some slice timing correction even if no information can be found in the BIDS data set.

In general, the slice order and the reference slice are entered in time units (ms) (this is the BIDS way of doing things) instead of as the slice index of the reference slice (the "SPM" way of doing things).

More info available on this page of the [SPM wikibook](https://en.wikibooks.org/wiki/SPM/Slice_Timing).

3. __Spatial Preprocessing__:
Performs spatial preprocessing by running the script: `BIDS_SpatialPrepro.m`

4. __SMOOTHING__:
Performs smoothing of the functional data by running the script: `BIDS_Smoothing.m`

5. __FIXED EFFECTS ANALYSIS (FIRST-LEVEL ANALYSIS)__:
Performs the fixed effects analysis by running the ffx script: `BIDS_FFX.m`

This will run twice, once for model specification and once for model estimation. See the function for more details.

This will take each condition present in the `events.tsv` file of each run and convolve it with a canonical HRF. It will also add the 6 realignment parameters of every run as confound regressors.

6. __RANDOM EFFECTS ANALYSIS (SECOND-LEVEL ANALYSIS)__:
Performs the random effects analysis by running the RFX script: `BIDS_RFX.m`

7. __GET THE RESULTS FROM A SPECIFIC CONTRAST__: `BIDS_Results.m`

- See __"batch.m"__ for examples and for the order of the scripts.
- See __"batch_download_run.m"__ for an example of how to download a data set and analyze it all in one go.
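Putting the steps together, a full analysis batch could look like the sketch below. This is only a sketch: the signatures of `BIDS_rmDummies`, `BIDS_STC` and `BIDS_SpatialPrepro` and the meaning of the `BIDS_RFX` arguments are inferred from `batch.m` and `demo/batch_download_run.m` shown later in this page and should be checked in the functions themselves; the FWHM value is an assumption.

```matlab
% Sketch of a full analysis batch (signatures inferred from batch.m and
% demo/batch_download_run.m; check the functions before running).
opt = getOption(); % analysis-specific options

FWHM = 6; % smoothing kernel in mm (assumed value)

BIDS_rmDummies(opt);        % 1. remove dummy scans (run even if opt.numDummies is 0)
BIDS_STC(opt);              % 2. slice time correction
BIDS_SpatialPrepro(opt);    % 3. spatial preprocessing
BIDS_Smoothing(FWHM, opt);  % 4. smoothing
BIDS_FFX(1, FWHM, opt, 0);  % 5. subject level: model specification
BIDS_FFX(2, FWHM, opt, 0);  %    subject level: model estimation
BIDS_RFX(1, 6, 6);          % 6. group level (see BIDS_RFX.m for the argument meaning)
BIDS_RFX(2, 6, 6);
BIDS_Results(FWHM, opt, 0); % 7. results for the specified contrasts
```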


## Docker

The recipe to build the docker image is in the `Dockerfile`.


### build docker image

To build the image with Octave and SPM from the `Dockerfile`, just type:
@@ -202,6 +215,7 @@

This will create an image with the tag name `cpp_spm_octave:0.0.1`


### run docker image

The following code would start the docker image and map two folders: one for the `output` and one for the `code` you want to run.
@@ -231,6 +245,7 @@ docker run -it --rm -v $data_dir/raw:/data:ro -v $data_dir:/out poldracklab/mriq

## Details about some steps


### Slice timing correction

Below are some comments from [here](http://mindhive.mit.edu/node/109) on STC and when it should be applied.
@@ -243,8 +258,10 @@ _If you do slice timing correction before realignment, you might look down your

_There's no way to avoid all the error (short of doing a four-dimensional realignment process combining spatial and temporal correction - Remi's note: fMRIprep does it), but I believe the current thinking is that doing slice timing first minimizes your possible error. The set of voxels subject to such an interpolation error is small, and the interpolation into another TR will also be small and will only affect a few TRs in the time course. By contrast, if one realigns first, many voxels in a slice could be affected at once, and their whole time courses will be affected. I think that's why it makes sense to do slice timing first. That said, here's some articles from the SPM e-mail list that comment helpfully on this subject both ways, and there are even more if you do a search for "slice timing AND before" in the archives of the list._
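As a side note on conventions: as mentioned earlier, this pipeline enters slice timing in time units (ms, the BIDS way) rather than as slice indices (the SPM way). The small sketch below, using hypothetical acquisition parameters that are not taken from the pipeline itself, shows how the two relate for an interleaved acquisition with a mid-volume reference.

```matlab
% Hypothetical example: 10 slices, TR = 2 s, ascending interleaved
% acquisition (odd slices first). Assumed values, not pipeline defaults.
nbSlices = 10;
TR = 2000; % in ms, following the time-unit convention described above

sliceOrder = [1:2:nbSlices, 2:2:nbSlices]; % acquisition order (slice indices)

% time at which each slice (in spatial order) was acquired, in ms
sliceTiming = zeros(1, nbSlices);
sliceTiming(sliceOrder) = (0:nbSlices - 1) * TR / nbSlices;

% mid-volume acquisition time point used as reference
referenceTime = TR / 2; % 1000 ms

disp(sliceTiming)
```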


## Boiler plate methods section


### Preprocessing

The fMRI data were pre-processed and analyzed using statistical parametric mapping (SPM12 – v7487; Wellcome Center for Neuroimaging, London, UK; www.fil.ion.ucl.ac.uk/spm) running on {octave 4.{??} / matlab 20{XX} (Mathworks)}.
@@ -263,6 +280,7 @@ The anatomical T1 image was bias field corrected, segmented and normalized to MN

Functional MNI normalized images were then spatially smoothed using a 3D gaussian kernel (FWHM = {XX} mm).


### fMRI data analysis

At the subject level, we performed a mass univariate analysis with a linear regression at each voxel of the brain, using generalized least squares with a global FAST model to account for temporal auto-correlation (Corbin et al., 2018) and a drift fit with discrete cosine transform basis (128 seconds cut-off). Image intensity scaling was done run-wide before statistical modeling such that the mean image has a mean intracerebral intensity of 100.
@@ -279,17 +297,19 @@ Table of contrast with weight: WIP

Group level: WIP


### References

Friston KJ, Ashburner J, Frith CD, Poline J-B, Heather JD & Frackowiak RSJ (1995) Spatial registration and normalization of images Hum. Brain Map. 2:165-189

Corbin, N., Todd, N., Friston, K. J. & Callaghan, M. F. Accurate modeling of temporal correlations in rapidly sampled fMRI time series. Hum. Brain Mapp. 39, 3884–3897 (2018).


## Testing
### Unit testing

All tests are in the test folder. There is also an empty dummy BIDS dataset that is partly created using the bash script `createDummyDataSet.sh`.


## Contributors ✨

Thanks goes to these wonderful people ([emoji key](https://allcontributors.org/docs/en/emoji-key)):
2 changes: 2 additions & 0 deletions batch.m
@@ -33,6 +33,8 @@
BIDS_RFX(1, 6, 6)
BIDS_RFX(2, 6, 6)

BIDS_Results(6, 6, opt, 0)

% subject level multivariate
isMVPA=1;
BIDS_FFX(1, 6, opt, isMVPA);
24 changes: 24 additions & 0 deletions demo/batch_download_run.m
@@ -46,6 +46,29 @@
opt.model.univariate.file = fullfile(WD, 'model-MoAE_smdl.json');


% specify the result to compute
opt.result.Steps(1) = struct(...
'Level', 'subject', ...
'Contrasts', struct(...
    'Name', 'listening', ... % has to match one of the contrasts defined in the model json file
'Mask', false, ... % this might need improving if a mask is required
'MC', 'FWE', ... FWE, none, FDR
'p', 0.05, ...
'k', 0, ...
'NIDM', true) );

opt.result.Steps(1).Contrasts(2) = struct(...
'Name', 'listening_inf_baseline', ...
'Mask', false, ...
'MC', 'none', ... FWE, none, FDR
'p', 0.01, ...
'k', 0, ...
'NIDM', true);





%% Get data
fprintf('%-40s:', 'Downloading dataset...');
urlwrite(URL, 'MoAEpilot.zip');
@@ -72,5 +95,6 @@
BIDS_Smoothing(FWHM, opt);
BIDS_FFX(1, FWHM, opt, 0);
BIDS_FFX(2, FWHM, opt, 0);
BIDS_Results(FWHM, opt, 0)


4 changes: 4 additions & 0 deletions demo/model-MoAE_smdl.json
Original file line number Diff line number Diff line change
Expand Up @@ -5,6 +5,10 @@
"task": "auditory"
},
"Steps": [
{
"Level": "dataset",
"AutoContrasts": []
},
{
"Level": "subject",
"AutoContrasts": ["trial_type.listening"],
43 changes: 18 additions & 25 deletions getOption.m
@@ -7,30 +7,18 @@
end

% group of subjects to analyze
opt.groups = {''};
% subject to run in each group
opt.subjects = {[1:2]};


% task to analyze
opt.taskName = 'MotionDecoding';




% The directory where the data are located

opt.dataDir = '/Users/mohamed/Desktop/MotionWorkshop/raw';
opt.dataDir = '/Users/mohamed/Desktop/Data/raw';


% Options for slice time correction
@@ -51,22 +39,27 @@
% Suffix output directory for the saved jobs
opt.JOBS_dir = fullfile(opt.dataDir, '..', 'derivatives', 'SPM12_CPPL', 'JOBS', opt.taskName);

% specify the model file that contains the contrasts to compute
opt.model.univariate.file = '/Users/mohamed/Documents/GitHub/BIDS_fMRI_scripts/model-motionDecodingUnivariate_smdl.json';
opt.model.multivariate.file = '/Users/mohamed/Documents/GitHub/BIDS_fMRI_scripts/model-motionDecodingMultivariate_smdl.json';

opt.model.univariate.file = '/home/remi/github/CPP_BIDS_SPM_pipeline/model-motionDecodingUnivariate_smdl.json';
opt.model.multivariate.file = '/home/remi/github/CPP_BIDS_SPM_pipeline/model-motionDecodingMultivariate_smdl.json';

% opt.model.univariate.file = '/home/remi/github/CPP_BIDS_SPM_pipeline/model-balloonanalogriskUnivariate_smdl.json';
% opt.model.multivariate.file = '/home/remi/github/CPP_BIDS_SPM_pipeline/model-balloonanalogriskMultivariate_smdl.json';
% specify the result to compute
opt.result.Steps(1) = struct(...
'Level', 'dataset', ...
'Contrasts', struct(...
    'Name', 'Vis_U', ... % has to match one of the contrasts defined in the model json file
'Mask', false, ... % this might need improving if a mask is required
'MC', 'none', ... FWE, none, FDR
'p', 0.05, ...
'k', 0, ...
'NIDM', true) );


% Save the opt variable as a mat file to load directly in the preprocessing
% scripts
save('opt.mat','opt')


end
26 changes: 20 additions & 6 deletions getOption_template
@@ -7,17 +7,18 @@ if nargin<1
end

% group of subjects to analyze
opt.groups = {''};
% subject to run in each group
opt.subjects = {[]};


% task to analyze
opt.taskName = 'balloonanalogrisktask';


% The directory where the data are located
opt.dataDir = '/home/remi/BIDS/ds001/rawdata';


% Options for slice time correction
@@ -39,7 +40,20 @@ opt.funcVoxelDims = [];
opt.JOBS_dir = fullfile(opt.dataDir, '..', 'derivatives', 'SPM12_CPPL', 'JOBS', opt.taskName);

% specify the model file that contains the contrasts to compute
opt.model.univariate.file = fullfile(fileparts(mfilename('fullpath')), 'model', 'model-balloonanalogriskUnivariate_smdl.json');
opt.model.multivariate.file = fullfile(fileparts(mfilename('fullpath')), 'model', 'model-balloonanalogriskMultivariate_smdl.json');


% specify the result to compute
opt.result.Steps(1) = struct(...
'Level', 'dataset', ...
'Contrasts', struct(...
    'Name', 'pumps_demean', ... % has to match one of the contrasts defined in the model json file
'Mask', false, ... % this might need improving if a mask is required
'MC', 'none', ... FWE, none, FDR
'p', 0.05, ...
'k', 0, ...
'NIDM', true) );


% Save the opt variable as a mat file to load directly in the preprocessing
22 changes: 22 additions & 0 deletions model/model-balloonanalogriskMultivariate_smdl.json
@@ -0,0 +1,22 @@
{
"Name": "balloonanalogrisk",
"Description": "contrasts for the balloonanalogrisk dataset",
"Input": {
"task": "balloonanalogrisktask"
},
"Steps": [
{
"Level": "subject",
"AutoContrasts": [],
"Contrasts": []
},
{
"Level": "run",
"AutoContrasts": ["trial_type.pumps_demean"]
},
{
"Level": "dataset",
"AutoContrasts": []
}
]
}