[ENH] New reports #198

Merged on Oct 21, 2016 (29 commits).

Commits
76b6c77  add notebooks (oesteban, Oct 7, 2016)
48cdf3f  start with new spikes interface (oesteban, Oct 7, 2016)
7e47df7  add new plots, move them to codebase (oesteban, Oct 8, 2016)
f9be8c3  new fMRIPlot class (oesteban, Oct 9, 2016)
e8b68ae  add gitignore in notebooks folder (oesteban, Oct 9, 2016)
5414a50  add juypiter checkpoint folder to gitignore (oesteban, Oct 9, 2016)
7056c6d  integrated new plots into workflow (oesteban, Oct 9, 2016)
d89887c  integrating MNI-EPI fast registration (oesteban, Oct 9, 2016)
013a2e8  integrated EPI-MNI registration, colorbar of carpetplot (oesteban, Oct 9, 2016)
3889d6a  move spikes mask, fft-based spike detection (oesteban, Oct 10, 2016)
ef44a2c  add new plots / fft computations (oesteban, Oct 11, 2016)
17e4f54  finishing plots for the QA lecture (oesteban, Oct 11, 2016)
7492868  Merge remote-tracking branch 'upstream/master' into enh/NewPlots (oesteban, Oct 11, 2016)
628c83d  remove mri_reorient, fix FD plot units (oesteban, Oct 11, 2016)
ea6cf55  fix circle.yml syntax (oesteban, Oct 11, 2016)
c0f268f  Merge branch 'master' into enh/NewPlots (oesteban, Oct 12, 2016)
5228d85  fix error in circle.yml (oesteban, Oct 12, 2016)
98aeb3c  don't demean the spike traces (chrisgorgo, Oct 16, 2016)
22c5df1  use only side slices instead of relying on the brain mask to define t… (chrisgorgo, Oct 16, 2016)
099bf1a  visual improvements for the spikes plot (chrisgorgo, Oct 16, 2016)
c2c2ef8  visual improvements for the spikes plot (chrisgorgo, Oct 16, 2016)
359f4cf  further visual improvements for the fMRI plot (chrisgorgo, Oct 16, 2016)
bc55efc  Merge branch 'enh/gs_pdf_merging' into enh/NewPlots (chrisgorgo, Oct 16, 2016)
6b87793  Merge pull request #1 from chrisfilo/enh/NewPlots (oesteban, Oct 16, 2016)
4dabbdc  Merge branch 'master' into enh/NewPlots (oesteban, Oct 21, 2016)
03f26b5  Integrate new parcellation into the carpetplot (oesteban, Oct 21, 2016)
a4f26df  remove fft spikes finder (oesteban, Oct 21, 2016)
417fe29  pylint fixes (oesteban, Oct 21, 2016)
67854c1  Finishing up #198 (oesteban, Oct 21, 2016)
6 changes: 6 additions & 0 deletions CHANGES.txt
@@ -1,3 +1,9 @@
Release 0.8.7
=============

* [ENH] New report layout for fMRI, added carpetplot (#198)
* [ENH] Anatomical workflow refactor (#219).

Release 0.8.6
=============

5 changes: 3 additions & 2 deletions circle.yml
@@ -20,7 +20,7 @@ dependencies:
# Create scratch folder and force group permissions
- mkdir -p $SCRATCH && sudo setfacl -d -m group:ubuntu:rwx $SCRATCH && sudo setfacl -m group:ubuntu:rwx $SCRATCH
- if [[ ! -d ~/data/ds003_downsampled ]]; then wget --retry-connrefused --waitretry=5 --read-timeout=20 --timeout=15 -t 0 -q -O ds003_downsampled.tar.gz "${DS003_URL}" && tar xzf ds003_downsampled.tar.gz -C ~/data/; fi
- pip install "niworkflows>=0.0.3a14"
- pip install "niworkflows>=0.0.3a15"
override:
- python -c "from niworkflows import data as nwd; nwd.get_mni_icbm152_nlin_asym_09c(); nwd.get_ds003_downsampled(); nwd.get_brainweb_1mm_normal()"
- if [[ -e ~/docker/image.tar ]]; then docker load -i ~/docker/image.tar; fi
@@ -33,7 +33,8 @@ test:
# Test mriqcp
- docker run -i -v /etc/localtime:/etc/localtime:ro -v ~/.cache/stanford-crn:/root/.cache/stanford-crn -v ${CIRCLE_TEST_REPORTS}:/scratch -w /root/src/mriqc --entrypoint="/usr/bin/run_tests" mriqc:py35 :
timeout: 2600
- docker run -i -v /etc/localtime:/etc/localtime:ro -v ~/.cache/stanford-crn:/root/.cache/stanford-crn -v ~/data:/data:ro -v $SCRATCH/func:/scratch -w /scratch mriqc:py35 /data/ds003_downsampled out/ participant -d func -w work/ --nthreads ${FUNC_NPROCS} --testing
- docker run -i -v /etc/localtime:/etc/localtime:ro -v ~/.cache/stanford-crn:/root/.cache/stanford-crn -v ~/data:/data:ro -v $SCRATCH/func:/scratch -w /scratch mriqc:py35 /data/ds003_downsampled out/ participant -d func -w work/ --nthreads ${FUNC_NPROCS} --testing :
timeout: 2600
- docker run -i -v /etc/localtime:/etc/localtime:ro -v ~/data:/data:ro -v $SCRATCH/func:/scratch -w /scratch mriqc:py35 /data/ds003_downsampled out/ group -d func -w work/
- docker run -i -v /etc/localtime:/etc/localtime:ro -v ~/.cache/stanford-crn:/root/.cache/stanford-crn -v ~/data:/data:ro -v $SCRATCH/anat:/scratch -w /scratch mriqc:py35 /data/ds003_downsampled out/ participant -d anat -w work/ --nthreads ${ANAT_NPROCS} --testing --ants-nthreads ${ANTS_NTHREADS} --verbose-reports :
timeout: 2600
8 changes: 5 additions & 3 deletions mriqc/info.py
@@ -9,7 +9,7 @@
import sys

__versionbase__ = '0.8.7'
__versionrev__ = 'a0'
__versionrev__ = 'a1'
__version__ = __versionbase__ + __versionrev__
__author__ = 'Oscar Esteban'
__email__ = 'code@oscaresteban.es'
@@ -52,7 +52,7 @@
'six',
'matplotlib',
'nibabel',
'niworkflows>=0.0.3a14',
'niworkflows>=0.0.3a15',
'pandas',
'dipy',
'jinja2',
@@ -66,6 +66,7 @@
'svgutils',
'nipype',
'rst2pdf',
'nipy'
]

LINKS_REQUIRES = [
@@ -84,7 +85,8 @@
EXTRA_REQUIRES = {
'doc': ['sphinx'],
'tests': TESTS_REQUIRES,
'duecredit': ['duecredit']
'duecredit': ['duecredit'],
'notebooks': ['ipython', 'jupyter']
}

# Enable a handle to install all extra dependencies at once
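The info.py hunk ends at a comment about enabling "a handle to install all extra dependencies at once". A minimal, hypothetical sketch of that common setuptools pattern (the `TESTS_REQUIRES` values are placeholders; this is not necessarily mriqc's exact code):

```python
# Hypothetical sketch of an "install everything" extras handle.
# EXTRA_REQUIRES mirrors the hunk above; TESTS_REQUIRES is a
# placeholder list for illustration only.
TESTS_REQUIRES = ['mock', 'codecov']

EXTRA_REQUIRES = {
    'doc': ['sphinx'],
    'tests': TESTS_REQUIRES,
    'duecredit': ['duecredit'],
    'notebooks': ['ipython', 'jupyter'],
}

# Flatten every extras group into a single 'all' key, so users
# could pull in everything with `pip install mriqc[all]`.
EXTRA_REQUIRES['all'] = [
    dep for group in list(EXTRA_REQUIRES.values()) for dep in group
]

print(sorted(EXTRA_REQUIRES['all']))
# ['codecov', 'duecredit', 'ipython', 'jupyter', 'mock', 'sphinx']
```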
7 changes: 4 additions & 3 deletions mriqc/interfaces/anatomical.py
@@ -7,7 +7,7 @@
# @Date: 2016-01-05 11:29:40
# @Email: code@oscaresteban.es
# @Last modified by: oesteban
# @Last Modified time: 2016-09-19 18:03:07
# @Last Modified time: 2016-10-07 14:37:52
""" Nipype interfaces to support anatomical workflow """
from __future__ import print_function
from __future__ import division
@@ -44,6 +44,9 @@ def __init__(self, **inputs):
self._results = {}
super(ArtifactMask, self).__init__(**inputs)

def _list_outputs(self):
return self._results

def _run_interface(self, runtime):
imnii = nb.load(self.inputs.in_file)
imdata = np.nan_to_num(imnii.get_data().astype(np.float32))
@@ -87,8 +90,6 @@ def _run_interface(self, runtime):
self._results['out_air_msk'])
return runtime

def _list_outputs(self):
return self._results

def artifact_mask(imdata, airdata, distance):
"""Computes a mask of artifacts found in the air region"""
22 changes: 22 additions & 0 deletions mriqc/interfaces/base.py
@@ -0,0 +1,22 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:

from __future__ import print_function, division, absolute_import, unicode_literals

from nipype.interfaces.base import BaseInterface


class MRIQCBaseInterface(BaseInterface):
"""
Adds the _results property and implements _list_outputs

"""

def __init__(self, **inputs):
self._results = {}
super(MRIQCBaseInterface, self).__init__(**inputs)

def _list_outputs(self):
return self._results
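The new base class factors out a simple pattern: interfaces stash their outputs in a `self._results` dict while running, and `_list_outputs` just hands that dict back. A dependency-free imitation of the idea (toy classes for illustration; the real class subclasses nipype's `BaseInterface`):

```python
# Toy imitation of the MRIQCBaseInterface pattern added above.
# Subclasses fill self._results during the run; _list_outputs
# simply returns the dict. (Illustrative only, not nipype's API.)
class ToyBaseInterface(object):
    def __init__(self, **inputs):
        self._results = {}
        self.inputs = inputs

    def _list_outputs(self):
        return self._results


class ToyCounter(ToyBaseInterface):
    def _run_interface(self):
        # A real interface would compute something from image files
        # here and record the output paths/values in _results.
        self._results['n_items'] = len(self.inputs.get('items', []))


counter = ToyCounter(items=[1, 2, 3])
counter._run_interface()
print(counter._list_outputs())  # {'n_items': 3}
```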
153 changes: 153 additions & 0 deletions mriqc/interfaces/functional.py
@@ -0,0 +1,153 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:

from __future__ import print_function, division, absolute_import, unicode_literals
from os import path as op
import numpy as np
import nibabel as nb

from .base import MRIQCBaseInterface
from nipype.interfaces.base import traits, TraitedSpec, BaseInterfaceInputSpec, File, isdefined
from nilearn.signal import clean
from scipy.stats.mstats import zscore


class SpikesInputSpec(BaseInterfaceInputSpec):
in_file = File(exists=True, mandatory=True, desc='input fMRI dataset')
in_mask = File(exists=True, desc='brain mask')
invert_mask = traits.Bool(False, usedefault=True, desc='invert mask')
no_zscore = traits.Bool(False, usedefault=True, desc='do not zscore')
detrend = traits.Bool(True, usedefault=True, desc='do detrend')
spike_thresh = traits.Float(6., usedefault=True,
desc='z-score to call one timepoint of one axial slice a spike')
skip_frames = traits.Int(0, usedefault=True,
desc='number of frames to skip in the beginning of the time series')
out_tsz = File('spikes_tsz.txt', usedefault=True, desc='output file name')
out_spikes = File(
'spikes_idx.txt', usedefault=True, desc='output file name')


class SpikesOutputSpec(TraitedSpec):
out_tsz = File(
desc='slice-wise z-scored timeseries (Z x N), inside brainmask')
out_spikes = File(desc='indices of spikes')
num_spikes = traits.Int(desc='number of spikes found (total)')


class Spikes(MRIQCBaseInterface):

"""
Computes the number of spikes
https://github.com/cni/nims/blob/master/nimsproc/qa_report.py

"""
input_spec = SpikesInputSpec
output_spec = SpikesOutputSpec

def _run_interface(self, runtime):
func_nii = nb.load(self.inputs.in_file)
func_data = func_nii.get_data()
func_shape = func_data.shape
ntsteps = func_shape[-1]
tr = func_nii.get_header().get_zooms()[-1]
nskip = self.inputs.skip_frames

if self.inputs.detrend:
data = func_data.reshape(-1, ntsteps)
clean_data = clean(data[:, nskip:].T, t_r=tr, standardize=False).T
new_shape = (
func_shape[0], func_shape[1], func_shape[2], clean_data.shape[-1])
func_data = np.zeros(func_shape)
func_data[..., nskip:] = clean_data.reshape(new_shape)

if not isdefined(self.inputs.in_mask):
_, mask_data, _ = auto_mask(
func_data, nskip=self.inputs.skip_frames)
else:
mask_data = nb.load(self.inputs.in_mask).get_data()
mask_data[..., :nskip] = 0
mask_data = np.stack([mask_data] * ntsteps, axis=-1)

if not self.inputs.invert_mask:
brain = np.ma.array(func_data, mask=(mask_data != 1))
else:
mask_data[..., :self.inputs.skip_frames] = 1
brain = np.ma.array(func_data, mask=(mask_data == 1))

if self.inputs.no_zscore:
ts_z = find_peaks(brain)
total_spikes = []
else:
total_spikes, ts_z = find_spikes(
brain, self.inputs.spike_thresh)
total_spikes = list(set(total_spikes))

out_tsz = op.abspath(self.inputs.out_tsz)
self._results['out_tsz'] = out_tsz
np.savetxt(out_tsz, ts_z)

out_spikes = op.abspath(self.inputs.out_spikes)
self._results['out_spikes'] = out_spikes
np.savetxt(out_spikes, total_spikes)
self._results['num_spikes'] = len(total_spikes)
return runtime

def find_peaks(data):
t_z = [data[:, :, i, :].mean(axis=0).mean(axis=0) for i in range(data.shape[2])]
return t_z

def find_spikes(data, spike_thresh):
data -= np.median(np.median(np.median(data, axis=0), axis=0), axis=0)
slice_mean = np.median(np.median(data, axis=0), axis=0)
t_z = _robust_zscore(slice_mean)
spikes = np.abs(t_z) > spike_thresh
spike_inds = np.transpose(spikes.nonzero())
# mask out the spikes and recompute z-scores using variance uncontaminated with spikes.
# This will catch smaller spikes that may have been swamped by big
# ones.
data.mask[:, :, spike_inds[:, 0], spike_inds[:, 1]] = True
slice_mean2 = np.median(np.median(data, axis=0), axis=0)
t_z = _robust_zscore(slice_mean2)

spikes = np.logical_or(spikes, np.abs(t_z) > spike_thresh)
spike_inds = [tuple(i) for i in np.transpose(spikes.nonzero())]
return spike_inds, t_z


def auto_mask(data, raw_d=None, nskip=3, mask_bad_end_vols=False):
from dipy.segment.mask import median_otsu
mn = data[:, :, :, nskip:].mean(3)
masked_data, mask = median_otsu(mn, 3, 2)
mask = np.concatenate((
np.tile(True, (data.shape[0], data.shape[1], data.shape[2], nskip)),
np.tile(np.expand_dims(mask == 0, 3), (1, 1, 1, data.shape[3]-nskip))),
axis=3)
mask_vols = np.zeros((mask.shape[-1]), dtype=int)
if mask_bad_end_vols:
# Some runs have corrupt volumes at the end (e.g., mux scans that are stopped prematurely). Mask those too.
# But... motion correction might have interpolated the empty slices such that they aren't exactly zero.
# So use the raw data to find these bad volumes.
# 2015.10.29 RFD: this caused problems with some non-mux EPI scans that (inexplicably)
# have empty slices at the top of the brain. So we'll disable it for
# now.
if raw_d is None:
slice_max = data.max(0).max(0)
else:
slice_max = raw_d.max(0).max(0)

bad = np.any(slice_max == 0, axis=0)
# We don't want to miss a bad volume somewhere in the middle, as that could be a valid artifact.
# So, only mask bad vols that are contiguous to the end.
mask_vols = np.array([np.all(bad[i:]) for i in range(bad.shape[0])])
# Mask out the skip volumes at the beginning
mask_vols[0:nskip] = True
mask[..., mask_vols] = True
brain = np.ma.masked_array(data, mask=mask)
good_vols = np.logical_not(mask_vols)
return brain, mask, good_vols

def _robust_zscore(data):
return ((data - np.atleast_2d(np.median(data, axis=1)).T) /
np.atleast_2d(data.std(axis=1)).T)
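As a quick sanity check of the `_robust_zscore` helper defined at the end of this file (median-centered per slice, scaled by the per-slice standard deviation), here is a self-contained toy example on synthetic data; the 6.0 threshold matches the default `spike_thresh`:

```python
import numpy as np


def _robust_zscore(data):
    # Same formula as in the diff above: center each slice's
    # timeseries on its median, scale by its standard deviation.
    return ((data - np.atleast_2d(np.median(data, axis=1)).T) /
            np.atleast_2d(data.std(axis=1)).T)


# Synthetic slice-mean matrix: 4 slices x 100 timepoints of unit
# Gaussian noise, with one large spike injected at slice 2, frame 7.
rng = np.random.RandomState(0)
slice_mean = rng.randn(4, 100)
slice_mean[2, 7] += 50.0

t_z = _robust_zscore(slice_mean)
spikes = np.transpose((np.abs(t_z) > 6.0).nonzero())
print(spikes.tolist())  # [[2, 7]] -- only the injected spike
```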
7 changes: 7 additions & 0 deletions mriqc/viz/__init__.py
@@ -0,0 +1,7 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# emacs: -*- mode: python; py-indent-offset: 4; indent-tabs-mode: nil -*-
# vi: set ft=python sts=4 ts=4 sw=4 et:
from __future__ import print_function, division, absolute_import, unicode_literals

from .fmriplots import spikesplot, spikesplot_cb, fmricarpetplot