Harmonize PET tracers handling (ML / DL / Stats) #137

Merged
51 commits merged on Nov 3, 2020
Commits (51)
17adcc8
Remove lower/upper case of PET tracer
alexandreroutier Oct 8, 2020
29dd415
Harmonize ml-prepare-data w.r.t. PET tracers
alexandreroutier Oct 8, 2020
69eda8c
Use -acq/--acq_label flag instead of -al/--acq_label
alexandreroutier Oct 8, 2020
596c570
Add path option to get_file_from_server
alexandreroutier Oct 18, 2020
b0190a9
Harmonize PET tracers handling (stats-surface)
alexandreroutier Oct 19, 2020
b3f4900
Add pet_volume_normalized_suvr_pet function
alexandreroutier Oct 19, 2020
670597f
Add group_label to pet_volume_normalized_suvr_pet
alexandreroutier Oct 19, 2020
6586ecc
Improve pet_volume_normalized_suvr_pet description
alexandreroutier Oct 20, 2020
317a55f
Harmonize PET tracers handling (stats-volume)
alexandreroutier Oct 20, 2020
ed7bdae
Use f-string for custom-pipeline/pet-* pipeline CLI
alexandreroutier Oct 20, 2020
c0fbe5b
Set False as default value for use_pvc_data
alexandreroutier Oct 20, 2020
2e83b7b
Harmonize PET tracers handling (ml-spatial-svm)
alexandreroutier Oct 20, 2020
10b4551
Improve readability of get_pet_surface_custom_file
alexandreroutier Oct 20, 2020
a91fe91
Document get_file_from_server
alexandreroutier Oct 20, 2020
9bf26b4
Centralize list of SUVR regions
alexandreroutier Oct 20, 2020
27ecf51
Format SUVR utils
alexandreroutier Oct 20, 2020
1b44532
Add PET_Introduction page
alexandreroutier Oct 20, 2020
0ff0dce
Unify how volume atlases are handled
alexandreroutier Oct 20, 2020
478029d
Update region_based_io.py
alexandreroutier Oct 21, 2020
86423f9
Update voxel_based_io.py
alexandreroutier Oct 21, 2020
8264d11
Harmonize PET tracers handling (ML modules)
alexandreroutier Oct 21, 2020
e18a076
Remove get_caps_t1_list/get_caps_pet_list
alexandreroutier Oct 22, 2020
25df672
Fix how CAPS files are grabbed
alexandreroutier Oct 22, 2020
83c758a
Add missing default keys to CAPSVertexBasedInput
alexandreroutier Oct 23, 2020
d144358
Harmonize PET tracers handling (ML-Workflows modules)
alexandreroutier Oct 23, 2020
2ca7ce5
Remove cprint() display used for debug
alexandreroutier Oct 23, 2020
7ec3c67
Improve description of --acq_label flag
alexandreroutier Oct 23, 2020
ec9699f
Improve PET Introduction page
alexandreroutier Oct 23, 2020
39c9a43
Explain how to add new volume atlas to Clinica
alexandreroutier Oct 27, 2020
b2bb50e
Update PET-Volume page w.r.t. tracers harmonization
alexandreroutier Oct 27, 2020
c601f2f
Update PET-Surface page w.r.t. tracers harmonization
alexandreroutier Oct 28, 2020
61ef52c
Improve CLI description of statistics-volume
alexandreroutier Oct 28, 2020
44261c3
Update Stats-Volume page w.r.t. tracers harmonization
alexandreroutier Oct 28, 2020
9fe2028
Generalize SUVR region for amyloid tracer
alexandreroutier Oct 28, 2020
14556e5
Proofreading
nburgos Oct 28, 2020
1584be4
Proofreading
nburgos Oct 28, 2020
83b09f0
Proofreading
nburgos Oct 28, 2020
17c8c78
Proofreading
nburgos Oct 28, 2020
df621d8
Update ML-SpatialSVM page w.r.t. tracers harmonization
alexandreroutier Oct 28, 2020
c81d0d4
Enforce <participant_id>/<session_id> wording
alexandreroutier Oct 28, 2020
66dbf8e
Update Stats-Surface page w.r.t. tracers harmonization
alexandreroutier Oct 28, 2020
58bba01
Update ML page w.r.t. tracers harmonization
alexandreroutier Oct 28, 2020
100f54f
Update index.md
alexandreroutier Oct 28, 2020
423063f
Proofreading
nburgos Oct 28, 2020
128f18c
Proofreading
nburgos Oct 28, 2020
efb7032
Proofreading
nburgos Oct 28, 2020
82e527c
Proofreading
nburgos Oct 28, 2020
3411629
Proofreading
nburgos Oct 28, 2020
047e037
Proofreading
nburgos Oct 28, 2020
40a7d4f
Improve description of acq_label flag
alexandreroutier Oct 29, 2020
022b72e
Clean get_file_from_server function
alexandreroutier Oct 29, 2020
370 changes: 231 additions & 139 deletions clinica/pipelines/machine_learning/input.py

Large diffs are not rendered by default.

562 changes: 414 additions & 148 deletions clinica/pipelines/machine_learning/ml_workflows.py

Large diffs are not rendered by default.

67 changes: 6 additions & 61 deletions clinica/pipelines/machine_learning/region_based_io.py
@@ -4,60 +4,6 @@
import numpy as np
import pandas as pd
import nibabel as nib
from os.path import join


def get_caps_t1_list(input_directory, subjects_visits_tsv, group_label, atlas_id):
"""
path to arrive to the list of the file with the statistics on atlas_id
Args:
input_directory:
subjects_visits_tsv:
group_label:
atlas_id:

Returns:

"""

from os.path import join
import pandas as pd

subjects_visits = pd.io.parsers.read_csv(subjects_visits_tsv, sep='\t')
if list(subjects_visits.columns.values) != ['participant_id', 'session_id']:
raise Exception('Subjects and visits file is not in the correct format.')
subjects = list(subjects_visits.participant_id)
sessions = list(subjects_visits.session_id)
image_list = [join(input_directory + '/subjects/' + subjects[i] + '/'
+ sessions[i] + '/t1/spm/dartel/group-' + group_label + '/atlas_statistics/' + subjects[i] + '_'
+ sessions[i]+'_T1w_space-'+atlas_id+'_map-graymatter_statistics.tsv')
for i in range(len(subjects))]
return image_list


def get_caps_pet_list(input_directory, subjects_visits_tsv, group_label, atlas_id):
"""

Args:
input_directory:
subjects_visits_tsv:
group_label:
atlas_id:

Returns:

"""

subjects_visits = pd.io.parsers.read_csv(subjects_visits_tsv, sep='\t')
if list(subjects_visits.columns.values) != ['participant_id', 'session_id']:
raise Exception('Subjects and visits file is not in the correct format.')
subjects = list(subjects_visits.participant_id)
sessions = list(subjects_visits.session_id)
image_list = [join(input_directory, 'analysis-series-default/subjects/' + subjects[i] + '/'
+ sessions[i] + '/pet/atlas_statistics/' + subjects[i] + '_' + sessions[i]
+ '_space-' + atlas_id + '_map-fdgstatistic2.tsv')
for i in range(len(subjects))]
return image_list


def load_data(image_list, subjects):
@@ -103,12 +49,14 @@ def features_weights(image_list, dual_coefficients, sv_indices, scaler=None):
"""

if len(sv_indices) != len(dual_coefficients):
print("Length dual coefficients: " + str(len(dual_coefficients)))
print("Length indices: " + str(len(sv_indices)))
raise ValueError('The number of support vectors indices and the number of coefficients must be the same.')
raise ValueError(
f"The number of support vectors indices and the number of coefficients must be the same.\n"
f"- Number of dual coefficients: {len(dual_coefficients)}\n"
f"- Number of indices:: {len(sv_indices)}\n"
)

if len(image_list) == 0:
raise ValueError('The number of images must be greater than 0.')
raise ValueError("The number of images must be greater than 0.")

sv_images = [image_list[i] for i in sv_indices]

@@ -134,9 +82,6 @@ def weights_to_nifti(weights, atlas, output_filename):
Returns:

"""

from os.path import join, split, realpath

from clinica.utils.atlas import AtlasAbstract

atlas_path = None
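For context, the deleted `get_caps_t1_list`/`get_caps_pet_list` helpers built CAPS paths by string concatenation for a single atlas and a hard-coded tracer. The ML input classes now describe the files they need with a pattern dictionary and let the generic `clinica_file_reader` grab them (see `clinica/pipelines/machine_learning/input.py`, whose large diff is not rendered above). Below is a minimal sketch of that pattern-based lookup; the glob, atlas name and group wildcard are illustrative only, and the positional signature of `clinica_file_reader` (subjects, sessions, CAPS directory, information) is assumed from its use in `build_input_node` further down.

```python
from clinica.utils.inputs import clinica_file_reader

# Hypothetical query for T1w gray-matter atlas statistics in a CAPS folder.
# The glob is resolved relative to <caps>/subjects/<participant_id>/<session_id>/.
atlas_statistics_info = {
    "pattern": "t1/spm/dartel/group-*/atlas_statistics/"
               "*_T1w_space-AAL2_map-graymatter_statistics.tsv",
    "description": "T1w gray-matter statistics on the AAL2 atlas",
    "needed_pipeline": "t1-volume",
}

# One file is expected per <participant_id>/<session_id> pair; missing files
# are reported together instead of failing one by one.
tsv_files = clinica_file_reader(
    ["sub-CLNC01", "sub-CLNC02"],   # participant_id list
    ["ses-M00", "ses-M00"],         # session_id list, same length
    "/path/to/caps",                # CAPS directory
    atlas_statistics_info,
)
```

Keeping the description and the producing pipeline next to the pattern is what allows the harmonized error messages to tell the user which upstream pipeline to run.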
37 changes: 0 additions & 37 deletions clinica/pipelines/machine_learning/voxel_based_io.py
@@ -1,44 +1,7 @@
# coding: utf8

import numpy as np
import pandas as pd
import nibabel as nib
from os.path import join


def get_caps_t1_list(input_directory, subjects_visits_tsv, group_label, fwhm, modulated):

subjects_visits = pd.io.parsers.read_csv(subjects_visits_tsv, sep='\t')
if list(subjects_visits.columns.values) != ['participant_id', 'session_id']:
raise Exception('Subjects and visits file is not in the correct format.')
subjects = list(subjects_visits.participant_id)
sessions = list(subjects_visits.session_id)
if fwhm == 0:
image_list = [join(input_directory, 'subjects/' + subjects[i] + '/'
+ sessions[i] + '/t1/spm/dartel/group-' + group_label + '/'
+ subjects[i] + '_' + sessions[i] + '_T1w_segm-graymatter'+'_space-Ixi549Space_modulated-'+modulated+'_probability.nii.gz') for i in range(len(subjects))]
else:
image_list = [join(input_directory, 'subjects/' + subjects[i] + '/'
+ sessions[i] + '/t1/spm/dartel/group-' + group_label + '/'
+ subjects[i] + '_' + sessions[i] + '_T1w_segm-graymatter' + '_space-Ixi549Space_modulated-' + modulated + '_fwhm-'+fwhm+'mm_probability.nii.gz')
for i in range(len(subjects))]

return image_list


def get_caps_pet_list(input_directory, subjects_visits_tsv, group_label, pet_type):

subjects_visits = pd.io.parsers.read_csv(subjects_visits_tsv, sep='\t')
if list(subjects_visits.columns.values) != ['participant_id', 'session_id']:
raise Exception('Subjects and visits file is not in the correct format.')
subjects = list(subjects_visits.participant_id)
sessions = list(subjects_visits.session_id)

image_list = [join(input_directory, 'subjects/' + subjects[i] + '/'
+ sessions[i] + '/pet/preprocessing/group-' + group_label + '/' + subjects[i]
+ '_' + sessions[i] + '_task-rest_acq-' + pet_type + '_pet_space-Ixi549Space_pet.nii.gz') for i in range(len(subjects))]

return image_list


def load_data(image_list, mask=True):
51 changes: 39 additions & 12 deletions clinica/pipelines/machine_learning_spatial_svm/spatial_svm_cli.py
@@ -18,6 +18,8 @@ def define_options(self):
"""Define the sub-command arguments."""
from colorama import Fore
from clinica.engine.cmdparser import PIPELINE_CATEGORIES
from clinica.utils.pet import LIST_SUVR_REFERENCE_REGIONS

# Clinica compulsory arguments (e.g. BIDS, CAPS, group_label)
clinica_comp = self._args.add_argument_group(PIPELINE_CATEGORIES['CLINICA_COMPULSORY'])
clinica_comp.add_argument("caps_directory",
@@ -30,17 +32,24 @@
'pet-volume' to use SUVr maps.''',
choices=['t1-volume', 'pet-volume'],
)
# Optional arguments
# Optional arguments for inputs from pet-volume pipeline
optional_pet = self._args.add_argument_group(
'%sPipeline options if you use inputs from pet-volume pipeline%s' %
(Fore.BLUE, Fore.RESET)
f"{Fore.BLUE}Pipeline options if you use inputs from pet-volume pipeline{Fore.RESET}"
)
optional_pet.add_argument("-pt", "--pet_tracer",
default='fdg',
help='PET tracer. Can be fdg or av45 (default: --pet_tracer %(default)s)')
optional_pet.add_argument("-no_pvc", "--no_pvc",
action='store_true', default=False,
help="Force the use of non PVC PET data (by default, PVC PET data are used)")
optional_pet.add_argument("-acq", "--acq_label",
type=str,
default=None,
help='Name of the label given to the acquisition, specifying the tracer used (acq-<acq_label>).')
optional_pet.add_argument("-suvr", "--suvr_reference_region",
choices=LIST_SUVR_REFERENCE_REGIONS,
default=None,
help='Intensity normalization using the average PET uptake in reference regions '
'resulting in a standardized uptake value ratio (SUVR) map. It can be '
'cerebellumPons (used for amyloid tracers) or pons (used for 18F-FDG tracers).')
optional_pet.add_argument("-pvc", "--use_pvc_data",
action='store_true',
default=False,
help="Use PET data with partial value correction (by default, PET data with no PVC are used)")
# Clinica standard arguments (e.g. --n_procs)
self.add_clinica_standard_arguments()
# Advanced arguments (i.e. tricky parameters)
@@ -54,15 +63,33 @@
def run_command(self, args):
"""Run the pipeline with defined args."""
from networkx import Graph
from colorama import Fore
from .spatial_svm_pipeline import SpatialSVM
from clinica.utils.exceptions import ClinicaException
from clinica.utils.ux import print_end_pipeline, print_crash_files_and_exit

if args.orig_input_data == 'pet-volume':
if args.acq_label is None:
raise ClinicaException(
f"{Fore.RED}You selected pet-volume pipeline without setting --acq_label flag. "
f"Clinica will now exit.{Fore.RESET}"
)
if args.suvr_reference_region is None:
raise ClinicaException(
f"{Fore.RED}You selected pet-volume pipeline without setting --suvr_reference_region flag. "
f"Clinica will now exit.{Fore.RESET}"
)

parameters = {
# Clinica compulsory arguments
'group_label': args.group_label,
'orig_input_data': args.orig_input_data,
'pet_tracer': args.pet_tracer,
'no_pvc': args.no_pvc,
'fwhm': args.fwhm,
# Optional arguments for inputs from pet-volume pipeline
'acq_label': args.acq_label,
'use_pvc_data': args.use_pvc_data,
'suvr_reference_region': args.suvr_reference_region,
# Advanced arguments
'fwhm': args.full_width_half_maximum,
}
pipeline = SpatialSVM(
caps_directory=self.absolute_path(args.caps_directory),
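The new `-suvr/--suvr_reference_region` flag takes its choices from `LIST_SUVR_REFERENCE_REGIONS`, centralized in `clinica.utils.pet` by the "Centralize list of SUVR regions" commit. A minimal sketch of that constant and of how a CLI reuses it, assuming the module only exposes the two regions named in the help text (the real module may define more):

```python
import argparse

# Sketch of clinica/utils/pet.py: reference regions available for SUVR
# intensity normalization. "pons" is typically used with 18F-FDG,
# "cerebellumPons" with amyloid tracers. (Assumed content.)
LIST_SUVR_REFERENCE_REGIONS = ["pons", "cerebellumPons"]

# Every PET-aware sub-command can then expose the same choices instead of
# hard-coding ['cerebellumPons', 'pons'] in each CLI module.
parser = argparse.ArgumentParser()
parser.add_argument(
    "-suvr", "--suvr_reference_region",
    choices=LIST_SUVR_REFERENCE_REGIONS,
    default=None,
    help="Reference region used for SUVR intensity normalization.",
)
args = parser.parse_args(["--suvr_reference_region", "cerebellumPons"])
print(args.suvr_reference_region)  # cerebellumPons
```

Because the default is `None`, `run_command` can detect that a PET-specific flag was omitted and raise the explicit `ClinicaException` shown above rather than silently assuming a tracer.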
clinica/pipelines/machine_learning_spatial_svm/spatial_svm_pipeline.py
@@ -16,18 +16,20 @@ def check_pipeline_parameters(self):
"""Check pipeline parameters."""
from clinica.utils.group import check_group_label

if 'group_label' not in self.parameters.keys():
raise KeyError('Missing compulsory group_label key in pipeline parameter.')
# Clinica compulsory parameters
self.parameters.setdefault('group_label', None)
check_group_label(self.parameters['group_label'])

if 'orig_input_data' not in self.parameters.keys():
raise KeyError('Missing compulsory orig_input_data key in pipeline parameter.')
if 'fwhm' not in self.parameters.keys():
self.parameters['fwhm'] = 4
if 'pet_tracer' not in self.parameters.keys():
self.parameters['pet_tracer'] = 'fdg'
if 'no_pvc' not in self.parameters.keys():
self.parameters['no_pvc'] = False

check_group_label(self.parameters['group_label'])
# Optional parameters for inputs from pet-volume pipeline
self.parameters.setdefault('acq_label', None)
self.parameters.setdefault('suvr_reference_region', None)
self.parameters.setdefault('use_pvc_data', False)

# Advanced parameters
self.parameters.setdefault('fwhm', 4)

def check_custom_dependencies(self):
"""Check dependencies that can not be listed in the `info.json` file.
@@ -53,14 +55,14 @@ def get_output_fields(self):
return ['regularized_image']

def build_input_node(self):
"""Build and connect an input node to the pipeline.
"""
"""Build and connect an input node to the pipeline."""
import os
from colorama import Fore
import nipype.pipeline.engine as npe
import nipype.interfaces.utility as nutil
from clinica.utils.inputs import clinica_file_reader, clinica_group_reader
from clinica.utils.input_files import t1_volume_final_group_template
from clinica.utils.input_files import (t1_volume_final_group_template,
pet_volume_normalized_suvr_pet)
from clinica.utils.exceptions import ClinicaCAPSError, ClinicaException
from clinica.utils.ux import print_groups_in_caps_directory

@@ -84,23 +86,26 @@
'description': 'graymatter tissue segmented in T1w MRI in Ixi549 space',
'needed_pipeline': 't1-volume-tissue-segmentation'
}
elif self.parameters['orig_input_data'] is 'pet-volume':
if self.parameters['no_pvc']:
caps_files_information = {
'pattern': os.path.join('pet', 'preprocessing', 'group-' + self.parameters['group_label'],
'*_pet_space-Ixi549Space_suvr-pons_pet.nii.gz'),
'description': self.parameters['pet_tracer'] + ' PET in Ixi549 space',
'needed_pipeline': 'pet-volume'
}
else:
caps_files_information = {
'pattern': os.path.join('pet', 'preprocessing', 'group-' + self.parameters['group_label'],
'*_pet_space-Ixi549Space_pvc-rbv_suvr-pons_pet.nii.gz'),
'description': self.parameters['pet_tracer'] + ' PET partial volume corrected (RBV) in Ixi549 space',
'needed_pipeline': 'pet-volume with PVC'
}
elif self.parameters['orig_input_data'] == 'pet-volume':
if not (
self.parameters["acq_label"]
and self.parameters["suvr_reference_region"]
):
raise ValueError(
f"Missing value(s) in parameters from pet-volume pipeline. Given values:\n"
f"- acq_label: {self.parameters['acq_label']}\n"
f"- suvr_reference_region: {self.parameters['suvr_reference_region']}\n"
f"- use_pvc_data: {self.parameters['use_pvc_data']}\n"
)
caps_files_information = pet_volume_normalized_suvr_pet(
acq_label=self.parameters["acq_label"],
suvr_reference_region=self.parameters["suvr_reference_region"],
use_brainmasked_image=False,
use_pvc_data=self.parameters["use_pvc_data"],
fwhm=0
)
else:
raise ValueError('Image type ' + self.parameters['orig_input_data'] + ' unknown.')
raise ValueError(f"Image type {self.parameters['orig_input_data']} unknown.")

try:
input_image = clinica_file_reader(self.subjects,
@@ -133,14 +138,11 @@ def build_output_node(self):
])

def build_output_node(self):
"""Build and connect an output node to the pipeline.
"""
"""Build and connect an output node to the pipeline."""
pass

def build_core_nodes(self):
"""Build and connect the core nodes of the pipeline.
"""

"""Build and connect the core nodes of the pipeline."""
import clinica.pipelines.machine_learning_spatial_svm.spatial_svm_utils as utils
import nipype.interfaces.utility as nutil
import nipype.pipeline.engine as npe
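The inline dictionaries that hard-coded `suvr-pons` patterns are replaced by `pet_volume_normalized_suvr_pet` from `clinica.utils.input_files`. The sketch below reconstructs what such a helper could return, based on the old dictionaries and on the keyword arguments passed in `build_input_node`; the exact filename entities, their order, and the `group-*` wildcard are assumptions, not the actual implementation.

```python
import os


def pet_volume_normalized_suvr_pet(
    acq_label, suvr_reference_region, use_brainmasked_image, use_pvc_data, fwhm
):
    """Sketch: CAPS query for SUVR-normalized PET maps in Ixi549Space."""
    pvc = "_pvc-rbv" if use_pvc_data else ""
    mask = "_mask-brain" if use_brainmasked_image else ""
    smooth = f"_fwhm-{fwhm}mm" if fwhm else ""
    return {
        # Glob resolved relative to <caps>/subjects/<participant_id>/<session_id>/
        "pattern": os.path.join(
            "pet", "preprocessing", "group-*",
            f"*_acq-{acq_label}_pet_space-Ixi549Space{pvc}{mask}{smooth}"
            f"_suvr-{suvr_reference_region}_pet.nii.gz",
        ),
        "description": (
            f"{acq_label} SUVR PET ({suvr_reference_region} reference) in Ixi549Space"
            + (" with RBV partial-volume correction" if use_pvc_data else "")
        ),
        "needed_pipeline": "pet-volume",
    }


# Example matching the call in build_input_node above:
info = pet_volume_normalized_suvr_pet(
    acq_label="fdg",
    suvr_reference_region="pons",
    use_brainmasked_image=False,
    use_pvc_data=False,
    fwhm=0,
)
```

Centralizing this query is what lets the spatial SVM, statistics and machine-learning pipelines request exactly the same SUVR maps for any tracer, instead of each pipeline assuming FDG or AV45.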
8 changes: 4 additions & 4 deletions clinica/pipelines/pet_surface/pet_surface_cli.py
@@ -17,19 +17,19 @@ def define_description(self):
def define_options(self):
"""Define the sub-command arguments."""
from clinica.engine.cmdparser import PIPELINE_CATEGORIES
from clinica.utils.pet import LIST_SUVR_REFERENCE_REGIONS
# Clinica compulsory arguments (e.g. BIDS, CAPS, group_label)
clinica_comp = self._args.add_argument_group(PIPELINE_CATEGORIES['CLINICA_COMPULSORY'])
clinica_comp.add_argument("bids_directory",
help='Path to the BIDS directory.')
clinica_comp.add_argument("caps_directory",
help='Path to the CAPS directory. (Filled with results from t1-freesurfer pipeline')
clinica_comp.add_argument("acq_label", type=str,
help='Name of the PET tracer label in the acquisition entity '
'(acq-<acq_label>).')
clinica_comp.add_argument("suvr_reference_region", choices=['cerebellumPons', 'pons'],
help='Name of the label given to the acquisition, specifying the tracer used (acq-<acq_label>).')
clinica_comp.add_argument("suvr_reference_region", choices=LIST_SUVR_REFERENCE_REGIONS,
help='Intensity normalization using the average PET uptake in reference regions '
'resulting in a standardized uptake value ratio (SUVR) map. It can be '
'cerebellumPons (used for AV45 tracers) or pons (used for 18F-FDG tracers).')
'cerebellumPons (used for amyloid tracers) or pons (used for 18F-FDG tracers).')
clinica_comp.add_argument("pvc_psf_tsv",
help='TSV file containing for each PET image its point spread function (PSF) measured '
'in mm at x, y & z coordinates. Columns must contain: '
@@ -29,7 +29,7 @@ def define_options(self):
clinica_comp.add_argument("suvr_reference_region", choices=['cerebellumPons', 'pons'],
help='Intensity normalization using the average PET uptake in reference regions '
'resulting in a standardized uptake value ratio (SUVR) map. It can be '
'cerebellumPons (used for AV45 tracers) or pons (used for 18F-FDG tracers).')
'cerebellumPons (used for amyloid tracers) or pons (used for 18F-FDG tracers).')
clinica_comp.add_argument("pvc_psf_tsv",
help='TSV file containing for each PET image its point spread function (PSF) measured '
'in mm at x, y & z coordinates. Columns must contain: '
8 changes: 4 additions & 4 deletions clinica/pipelines/pet_surface/pet_surface_pipeline.py
@@ -157,10 +157,10 @@ def build_input_node_longitudinal(self):

check_relative_volume_location_in_world_coordinate_system('T1w-MRI (orig_nu.mgz)',
read_parameters_node.inputs.orig_nu,
self.parameters['acq_label'].upper() + ' PET',
self.parameters['acq_label'] + ' PET',
read_parameters_node.inputs.pet,
self.bids_directory,
self.parameters['acq_label'].lower())
self.parameters['acq_label'])

self.connect([
(read_parameters_node, self.input_node, [('pet', 'pet'),
@@ -260,9 +260,9 @@ def build_input_node_cross_sectional(self):
raise ClinicaException(error_message)

check_relative_volume_location_in_world_coordinate_system('T1w-MRI (orig_nu.mgz)', read_parameters_node.inputs.orig_nu,
self.parameters['acq_label'].upper() + ' PET', read_parameters_node.inputs.pet,
self.parameters['acq_label'] + ' PET', read_parameters_node.inputs.pet,
self.bids_directory,
self.parameters['acq_label'].lower())
self.parameters['acq_label'])

self.connect([
(read_parameters_node, self.input_node, [('pet', 'pet'),