Centralized all deep learning segmentation tasks with new function "sct_deepseg" and refactored sct_download_data #2639

Merged (127 commits, May 22, 2020)

Conversation

@jcohenadad (Member) commented Mar 19, 2020

With the need for more custom deep learning segmentation models and architectures, as well as new tasks (e.g. multi-class segmentation, CSF segmentation, etc.), it becomes more convenient to have a single CLI function, sct_deepseg, that calls a specific model.

One notable feature of this new function is that it relies on the actively maintained ivadomed package. Another notable change is that we will now rely on the PyTorch framework (instead of TensorFlow).

For now, this new function will not replace sct_deepseg_X; it will complement the existing functions so we can gather feedback from users before deprecating them. The "big" change will likely be associated with a major release (e.g. 5.0.0).

Changes include:

  • Introduced new function sct_deepseg that can point to a specific model. Fixes #2626 (Centralize deep-learning segmentation tools into a single entry point and module)
  • Centralized hard-coded access to paths for models and data. Fixes #2160
  • Deep-learning models are installed under $SCT_DIR/models
    • New SCT variable: sct.__models_dir__
  • Added useful functions under sct.utils: add_suffix(), extract_fname(), tmp_create()
  • Implemented a feature to set the threshold on the output segmentation
  • Default models are installed during SCT installation via: sct_deepseg -download-default-models
  • Added ivadomed/master to requirements.txt
  • Refactored sct_download_data and moved some functions under spinalcordtoolbox.download

Todo in subsequent PRs:

To test this branch on a development ivadomed version:

# Checkout
cd $SCT_DIR
git fetch origin
git checkout -b jca/2626-deepseg origin/jca/2626-deepseg
# Activate SCT's venv
source ${SCT_DIR}/python/etc/profile.d/conda.sh
conda activate venv_sct
# Create alias to new sct_deepseg executable
pip install -e .
cp python/envs/venv_sct/bin/sct_deepseg bin/
# Clone ivadomed repos and pip install it inside SCT venv
cd ~  # or wherever you'd like to download that
git clone https://github.com/neuropoly/ivado-medical-imaging.git
cd ivado-medical-imaging
pip install -e .
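
Once both installs succeed, a quick smoke test is the same command charleygros runs further down in this thread (model name taken from this PR's MODELS list):

# Segment the SCT testing data with the new entry point
sct_deepseg -i sct_testing_data/t2s/t2s.nii.gz -m t2star_sc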

@jcohenadad added the "feature" label (category: new functionality) on Mar 19, 2020
@jcohenadad added this to the 4.2.3 milestone on Mar 19, 2020
@jcohenadad added the "sct_deepseg" label (context: global entry point for all deep learning segmentation methods) on Mar 30, 2020
Comment on lines 3 to 4
# This command-line tool is the interface for the deepseg API that performs segmentation using deep learning from the
# ivadomed package.
Contributor:

You can write

Suggested change
# This command-line tool is the interface for the deepseg API that performs segmentation using deep learning from the
# ivadomed package.
"""
This command-line tool is the interface for the deepseg API that performs segmentation using deep learning from the
ivadomed package.
"""

To get pretty module-level docstrings. (python treats leading comments as module docstrings as well, but retains the #s, so I think using a triple-string is more idiomatic.)

Member Author:

indeed-- fixed here 021f205

scripts/sct_deepseg.py (outdated; resolved)
:param param_deepseg: class ParamDeepseg: Segmentation parameters.
:return: fname_out: str: Output filename.
"""
nii_seg = segment_volume(folder_model, fname_image)
Member Author:

note: import ivadomed as imed
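
For orientation, the call above boils down to something like the following hedged sketch. segment_volume() and add_suffix() appear in this PR and its traceback further down; the ivadomed module path is assumed from the ivadomed/utils.py link referenced later in this thread, and the save step and return value are my assumptions, not the actual spinalcordtoolbox/deepseg/core.py code:

import ivadomed as imed
from spinalcordtoolbox.utils import add_suffix

def segment_nifti(fname_image, folder_model):
    # Run ivadomed inference with the model stored in folder_model (expects <name>.pt and <name>.json)
    nii_seg = imed.utils.segment_volume(folder_model, fname_image)
    # Derive the output filename from the input, e.g. t2s.nii.gz -> t2s_seg.nii.gz
    fname_out = add_suffix(fname_image, '_seg')
    nii_seg.to_filename(fname_out)  # assuming segment_volume returns a nibabel image
    return fname_out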



MODELS = {
    'cord-t2star': {'url': 'https://osf.io/8nsa6/',
Member:

The other day we said that the version is "encoded" in the URL --> is that correct? (Just checking whether we need a version or date key.)

Member Author:

fixed in c03c647
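
For reference, after that fix an entry in MODELS looks roughly like this. The URL stem comes from the snippet above and the ?version= pattern from the uqueensland entries quoted further down; the version number and description wording here are illustrative only:

MODELS = {
    'cord-t2star':
        {'url': 'https://osf.io/8nsa6/download?version=1',  # model version "encoded" in the OSF URL (illustrative)
         'description': 'Cord segmentation on T2*-weighted contrast.'},  # wording illustrative
}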

"""
Check if model (.pt file) is installed under SCT directory
"""
if os.path.exists(os.path.join(self.folder, self.name + '.pt')):
Member:

Should we test whether the .json is also present? Or can we reasonably assume that?

Member Author:

I was thinking about that. My reasoning here was that the json file is not "mandatory" to execute a segmentation. Only the model is, which is why I didn't check for it. Open for discussion.

Member Author:

My reasoning here was that the json file is not "mandatory" to execute a segmentation.

Ah! Actually I am wrong. It is mandatory! See: https://github.com/neuropoly/ivado-medical-imaging/blob/master/ivadomed/utils.py#L624.

Oh well, I will check for it then ;-)

Member Author:

Done in b469daa

Member:

Great!
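
The resulting check (a paraphrased sketch of what b469daa does, not the verbatim code) is roughly:

import os

# Method of the model-handling class discussed above (class name assumed)
def is_installed(self):
    """Return True only if both the model weights (.pt) and its metadata (.json) are present."""
    path_pt = os.path.join(self.folder, self.name + '.pt')
    path_json = os.path.join(self.folder, self.name + '.json')
    return os.path.exists(path_pt) and os.path.exists(path_json)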

'uqueensland_mice_sc':
    {'url': 'https://osf.io/nu3ma/download?version=2',
     'description': 'Cord segmentation on mouse MRI. Data from University of Queensland.'},
'uqueensland_mice_gm':
Contributor:

According to the convention this should be

Suggested change
'uqueensland_mice_gm':
'mice_uqueensland_gm':

Member Author:

fixed in 57b99e3

@charleygros (Member):

# Clone ivadomed repos and pip install it inside SCT venv
cd ~  # or wherever you'd like to download that
git pull https://github.com/neuropoly/ivado-medical-imaging.git
cd ivado-medical-imaging
pip install -e .

Minor comment: for tests run by external users, we should recommend doing a git clone instead of a git pull.

@charleygros (Member):

I think the field threshold is missing from ../spinalcordtoolbox/models/t2star_sc/t2star_sc.json.

@charleygros (Member):

@jcohenadad: Please find below the terminal output I got.

I followed the installation process you described in the first post of this PR, and additionally:

  • manually added a threshold to the model metadata.
  • pulled the latest version of the ivadomed PR (last commit, related to a typo in a nibabel function).
(venv_sct) chgroc@rosenberg:~/sct_ivado$ sct_deepseg -i sct_testing_data/t2s/t2s.nii.gz -m t2star_sc

--
Spinal Cord Toolbox (git-jca/2626-deepseg-53f3f5f03c9931f02e5f5c84b0a4d5cf0ba4e431)


Loaded 9 axial slices..
/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/torch/serialization.py:453: SourceChangeWarning: source code of class 'ivadomed.models.Decoder' has changed. you can retrieve the original source code by accessing the object's source attribute or set `torch.nn.Module.dump_patches = True` and use the patch tool to revert the changes.
  warnings.warn(msg, SourceChangeWarning)
/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/python/envs/venv_sct/lib/python3.6/site-packages/torch/serialization.py:453: SourceChangeWarning: source code of class 'ivadomed.models.UpConv' has changed. you can retrieve the original source code by accessing the object's source attribute or set `torch.nn.Module.dump_patches = True` and use the patch tool to revert the changes.
  warnings.warn(msg, SourceChangeWarning)
Traceback (most recent call last):
  File "/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/scripts/sct_deepseg.py", line 141, in <module>
    main()
  File "/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/scripts/sct_deepseg.py", line 136, in main
    sct.deepseg.core.segment_nifti(args.i, path_model, param)
  File "/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/spinalcordtoolbox/deepseg/core.py", line 50, in segment_nifti
    fname_out = sct.utils.add_suffix(fname_image, '_seg')
  File "/home/GRAMES.POLYMTL.CA/chgroc/spinalcordtoolbox/spinalcordtoolbox/utils.py", line 119, in add_suffix
    parent, stem, ext = extract_fname(fname)
NameError: name 'extract_fname' is not defined
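
For context, the two helpers named in that traceback behave roughly as below (a hedged sketch, not the actual spinalcordtoolbox.utils code; the crash simply means extract_fname was not visible from add_suffix's scope):

import os

def extract_fname(fname):
    # Split '/path/to/t2s.nii.gz' into ('/path/to', 't2s', '.nii.gz'), handling double extensions
    parent, filename = os.path.split(fname)
    stem, ext = os.path.splitext(filename)
    if ext == '.gz':
        stem, ext2 = os.path.splitext(stem)
        ext = ext2 + ext
    return parent, stem, ext

def add_suffix(fname, suffix):
    # Insert a suffix between stem and extension: 't2s.nii.gz' -> 't2s_seg.nii.gz'
    parent, stem, ext = extract_fname(fname)
    return os.path.join(parent, stem + suffix + ext)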

@jcohenadad (Member Author):

I think the field threshold is missing from ../spinalcordtoolbox/models/t2star_sc/t2star_sc.json

Indeed -- I added it manually -- we will need to add it to all models (along with the other relevant postprocessing fields).
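
Concretely, once those fields are added, the model's json metadata would carry something like the following (shown as the Python dict it parses to; the keys mirror the postprocessing DEFAULTS discussed later in this thread, and the values here are illustrative):

# Illustrative contents of e.g. models/t2star_sc/t2star_sc.json after adding the postprocessing fields
metadata = {
    'threshold': 0.9,             # binarization threshold applied to the soft prediction
    'keep_largest_object': True,  # keep only the largest connected component
    'fill_holes': True,           # fill holes in the binary segmentation
}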


# Postprocessing
metadata = sct.deepseg.models.get_metadata(folder_model)
postproc = PostProcessing(param, metadata)
Member Author:

@kousu is it overkill to create a class for this, or should we simply define each postprocessing function with nii, param and metadata as input parameters?

@kousu (Contributor) Apr 16, 2020:

I think a class doesn't offer anything here. A class is useful if you need to keep some state around for a while, but here you want to use the data immediately and get out.

I think what I would do is write it as

def postprocess(nii_seg, param, metadata):
   options = {**DEFAULTS, **metadata, **{k: v for k,v in param.__dict__.items() if v is not None}}
   if options['threshold']:
      nii_seg = threshold(nii_seg, options['threshold'])
   if options['keep_largest_object']:
      nii_seg = keep_largest_object(nii_seg)
   if options['fill_holes']:
      nii_seg = fill_holes(nii_seg)
   return nii_seg

or, if you take my suggestion about dropping ParamDeepseg, then

def postprocess(nii_seg, param, metadata):
   options = {**DEFAULTS, **metadata, **param}
   if options['threshold']:
      nii_seg = threshold(nii_seg, options['threshold'])
   if options['keep_largest_object']:
      nii_seg = keep_largest_object(nii_seg)
   if options['fill_holes']:
      nii_seg = fill_holes(nii_seg)
   return nii_seg

Member Author:

the "merge" approach is brilliant. Currently trying to implement it.

Member Author:

implemented in bea7a3c

    nii_seg = imed.postprocessing.fill_holes(nii_seg)
else:
    logger.warning("Algorithm 'fill holes' can only be run on binary segmentation. Skipping.")
    return nii_seg
Contributor:

This return is tabbed in one level too deep.

Member Author:

fixed in bea7a3c

"Neither I nor S is present in code: {}, for affine matrix: {}".format(code, affine))
nii_seg = imed.postprocessing.keep_largest_object_per_slice(nii_seg, axis=axis_infsup)
else:
logger.warning("Algorithm 'keep largest object' can only be run on binary segmentation. Skipping.")
Contributor:

Could this warning be an exception? "Errors should never pass silently. Unless explicitly silenced," advocates the Zen of Python.

Member Author:

Well, technically it is not an error, just a warning that this step will be skipped. If we stop the program here, I anticipate that all users who try the flag "-thr 0" will see a crash, and since most users don't read their crash report, they will wrongly think that "-thr 0" doesn't work... so I'd rather give them an output and inform the small percentage of users who actually read reports that some algorithms were not run because of -thr 0.
@charleygros do you have any opinion?

Member:

I would agree that this is more a warning than an error, and would then agree to skip it.
Another approach would be to call the thresholding with a default value (e.g. 10e-3) in case the object is not binary (i.e. thr=0) when entering keep_largest_object, and then use the mask_prediction function to remove the small objects from the soft prediction based on the binary output of keep_largest.

Member Author:

Another approach would be to call the thresholding with a default value (e.g. 10e-3) in case the object is not binary (i.e. thr=0) when entering keep_largest_object, and then use the mask_prediction function to remove the small objects from the soft prediction based on the binary output of keep_largest.

Yup, I like this idea!

Member Author:

This should be implemented in ivadomed though, issue opened: ivadomed/ivadomed#205
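
For the record, the idea charleygros describes above amounts to something like this plain numpy/scipy sketch (not the eventual ivadomed implementation; the 10e-3 default comes from his comment, while the helper name and the rest are assumptions):

import numpy as np
from scipy import ndimage

def keep_largest_object_soft(pred, thr=10e-3):
    # Binarize the soft prediction at a small default threshold, keep the largest
    # connected object, then use that binary mask to zero out the rest of the soft values.
    binary = pred > thr
    labels, n_obj = ndimage.label(binary)
    if n_obj == 0:
        return pred
    sizes = ndimage.sum(binary, labels, index=range(1, n_obj + 1))
    largest = labels == (np.argmax(sizes) + 1)
    return pred * largest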

Comment on lines 74 to 81
# TODO: This if/elif below is ugly. Cannot think of something better for now...
do_process = DEFAULT_KEEP_LARGEST_OBJECT
if self.param.keep_largest_object is True:
    do_process = True
elif self.param.keep_largest_object is None:
    if 'keep_largest_object' in self.metadata:
        do_process = self.metadata.keep_largest_object
if do_process:
@kousu (Contributor) Apr 16, 2020:

I thought about what possible values everything could take here. It looks like you want self.param taking precedence, then self.metadata, then falling back to DEFAULT_, with None or not in ... meaning "undefined, so try the next option".

I would say you want do_process = param or metadata or default, except that will misinterpret False as "continue to fallback" instead of "stop here".

I rewrote this expression as a function I could test exhaustively with all the possible values; I made one simplification that I don't think would change the meaning -- to represent 'keep_largest_object' not in self.metadata as metadata is None.

from pprint import pprint
def do_process(param, metadata, default):
    if param is True:
        return True
    elif param is None:
        if metadata is not None:
            return metadata
    return default

pprint(
  [(param, metadata, default, "=>", do_process(param, metadata, default))
  for param in [True, False, None]
  for metadata in [True, False, None]
  for default in [True, False]])

this gave me this truth table:

[(True, True, True, '=>', True),
 (True, True, False, '=>', True),
 (True, False, True, '=>', True),
 (True, False, False, '=>', True),
 (True, None, True, '=>', True),
 (True, None, False, '=>', True),
 (False, True, True, '=>', True),
 (False, True, False, '=>', False),
 (False, False, True, '=>', True),
 (False, False, False, '=>', False),
 (False, None, True, '=>', True),
 (False, None, False, '=>', False),
 (None, True, True, '=>', True),
 (None, True, False, '=>', True),
 (None, False, True, '=>', False),
 (None, False, False, '=>', False),
 (None, None, True, '=>', True),
 (None, None, False, '=>', False)]

In all cases where param is True, it's True (of course, that's written right into the logic). But when it's False it skips metadata and uses default instead.

Is that what you really meant? param=False and param=None both mean "continue", but metadata=False means "stop" yet metadata=None means "continue"? I think that's weird. I think the actual logic you want is

def do_process(param, metadata, default):
    if param is True:
        return True
    elif param is False:
        return False
    elif param is None:
        if metadata is not None:
            return metadata
    return default

which can be written shorter as

def do_process(param, metadata, default):
    if param is not None:
        return param
    if metadata is not None:
        return metadata
    return default

The truth table for this is

[(True, True, True, '=>', True),
 (True, True, False, '=>', True),
 (True, False, True, '=>', True),
 (True, False, False, '=>', True),
 (True, None, True, '=>', True),
 (True, None, False, '=>', True),
 (False, True, True, '=>', False),
 (False, True, False, '=>', False),
 (False, False, True, '=>', False),
 (False, False, False, '=>', False),
 (False, None, True, '=>', False),
 (False, None, False, '=>', False),
 (None, True, True, '=>', True),
 (None, True, False, '=>', True),
 (None, False, True, '=>', False),
 (None, False, False, '=>', False),
 (None, None, True, '=>', True),
 (None, None, False, '=>', False)]

@kousu (Contributor) Apr 16, 2020:

The other tricky thing I see is that 'undefined' is represented in two different ways: for self.param, it's represented by None, for self.metadata it's represented by an actual missing value. What should happen if someone set self.metadata['keep_largest_object'] = None? Or if someone did del self.param.keep_largest_object?

If you could change the representation this would be easier: delete ParamDeepseg and just tell users to pass a dictionary -- you could even write def __init__(self, metadata, **params). Without that, we're going to have to accept some baseline ugliness.


Anyway, if you accept my changing of your logic above and the need to deal with different representations, you can rewrite this with the ternary hook:

Suggested change
# TODO: This if/elif below is ugly. Cannot think of something better for now...
do_process = DEFAULT_KEEP_LARGEST_OBJECT
if self.param.keep_largest_object is True:
    do_process = True
elif self.param.keep_largest_object is None:
    if 'keep_largest_object' in self.metadata:
        do_process = self.metadata.keep_largest_object
if do_process:
if (self.param.keep_largest_object
        if self.param.keep_largest_object is not None
        else self.metadata['keep_largest_object']
        if 'keep_largest_object' in self.metadata
        else DEFAULT_KEEP_LARGEST_OBJECT):

Python has a shortcut for the if .. in .. else part, dict.get(...[, default]):

Suggested change
# TODO: This if/elif below is ugly. Cannot think of something better for now...
do_process = DEFAULT_KEEP_LARGEST_OBJECT
if self.param.keep_largest_object is True:
    do_process = True
elif self.param.keep_largest_object is None:
    if 'keep_largest_object' in self.metadata:
        do_process = self.metadata.keep_largest_object
if do_process:
if (self.param.keep_largest_object
        if self.param.keep_largest_object is not None
        else self.metadata.get('keep_largest_object',
                               DEFAULT_KEEP_LARGEST_OBJECT)):

@kousu (Contributor) Apr 16, 2020:

You could also map None => missing directly, and then make use of dict.get():

Suggested change
# TODO: This if/elif below is ugly. Cannot think of something better for now...
do_process = DEFAULT_KEEP_LARGEST_OBJECT
if self.param.keep_largest_object is True:
    do_process = True
elif self.param.keep_largest_object is None:
    if 'keep_largest_object' in self.metadata:
        do_process = self.metadata.keep_largest_object
if do_process:
if ({k: v for k,v in self.param.__dict__.items() if v is not None}).get('keep_largest_object', self.metadata.get('keep_largest_object', DEFAULT_KEEP_LARGEST_OBJECT)):

@kousu (Contributor) Apr 16, 2020:

Or what about viewing it explicitly as a merge? Apparently in Python 3.9 you will be able to write defaults | metadata | param, but in the meantime we have this {**defaults, **metadata, **param} syntax:

DEFAULTS = {
  'threshold': 0.9,
  'keep_largest_object': True,
  'fill_holes': True,
}

class PostProcessing:
    """
    Deals with post-processing of the segmentation. Consider param (i.e. user's flag) with more priority than
    metadata (i.e. from model's json file).
    """
    def __init__(self, param, metadata):
        """
        :param param: dict: Defined by user's parameter
        :param metadata: dict: From model's json metadata
        """

        # param overrides metadata overrides defaults
        self.param = {**DEFAULTS,
                      **metadata,
                      **param}

        [...]

    def keep_largest_object(self, nii_seg):
        """
        Only keep largest object
        """
        if self.param['keep_largest_object']:
            [...]

Member Author:

merged strategy implemented in bea7a3c


def threshold(self, nii_seg):
    """
    Threshold the prediction. For no prediction, set 'threshold' to 0.
Contributor:

Is this a typo?

Suggested change
Threshold the prediction. For no prediction, set 'threshold' to 0.
Threshold the prediction. For no threshold, set 'threshold' to 0.

Member Author:

Yes-- Fixed in ce8a184

Comment on lines 55 to 57
Threshold the prediction. For no prediction, set 'threshold' to 0.
"""
if self.param.threshold:
@kousu (Contributor) Apr 16, 2020:

You can use threshold = None to mean "no threshold" instead of letting any falsey value -- namely 0.0 -- trigger this. That would be more uniform with the other inputs.

Editorializing: It's a remnant of C's limited type system that we have this instinct to repurpose the meanings of specific values.
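
i.e., roughly the following sketch of that convention, written against the pre-ce8a184 Param-style code (the nibabel-based binarization is an assumption, not the actual method):

import nibabel as nib

def threshold(self, nii_seg):
    # None means "do not threshold"; any numeric value (including 0.0) is applied.
    if self.param.threshold is not None:
        data = (nii_seg.get_fdata() > self.param.threshold).astype('uint8')
        nii_seg = nib.Nifti1Image(data, nii_seg.affine)
    return nii_seg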

Member Author:

No longer relevant with the current implementation (ce8a184).

Comment on lines 72 to 74
# if urls is not a list, make it one
if not isinstance(urls, (list, tuple)):
    urls = [urls]
Contributor:

This should be a bit more reliable:

Suggested change
# if urls is not a list, make it one
if not isinstance(urls, (list, tuple)):
    urls = [urls]
# if urls is not a list, make it one
if isinstance(urls, str):
    urls = [urls]

Member Author:

indeed-- fixed in 88a04b5

Comment on lines 107 to 109
args = parser.parse_args(args=None if sys.argv[1:] else ['--help'])
# TODO: instead of assigning each args param, we could pass args while instanciating ParamDeepseg(args), and the
# class would deal with assigning arguments to each field.
@kousu (Contributor) Apr 16, 2020:

Instead of using ParamDeepseg, you could do something like:

Suggested change
args = parser.parse_args(args=None if sys.argv[1:] else ['--help'])
# TODO: instead of assigning each args param, we could pass args while instanciating ParamDeepseg(args), and the
# class would deal with assigning arguments to each field.
args = parser.parse_args(args=None if sys.argv[1:] else ['--help'])
args ={k: v for k in vars(args).items() if v is not None}
# separate out the segmentation `param` args from the top level args
input = args.pop('i')
list_models = args.pop('list_models')
install_model = args.pop('install_model')
install_default_models = args.pop('install_default_models')
...

You'd have to remove all the default values/make sure they're set to None; default=param_default.keep_largest_object is an extra layer of defaults which might accidentally override a model's metadata even when the user doesn't ask for it.

Member Author:

actually it fails because v is not defined

@kousu (Contributor) Apr 16, 2020:

You have to watch out because the argparse names don't match the postprocess() names. You have args.thr vs param.threshold and args.keep_largest vs param.keep_largest_object. You can use dest to fixup the naming, or just change the names of the flags in the CLI.

Contributor:

You could be explicit about which subset is destined for algorithm parameters instead of using .pop() after the fact to pull off the other args:

params = ['keep_largest_object', 'fill_holes', 'threshold']
params = {k: vars(args)[k] for k in params if vars(args)[k] is not None}

Contributor:

Oh sorry!! Let me try that again:

Suggested change
args = parser.parse_args(args=None if sys.argv[1:] else ['--help'])
# TODO: instead of assigning each args param, we could pass args while instanciating ParamDeepseg(args), and the
# class would deal with assigning arguments to each field.
args = parser.parse_args(args=None if sys.argv[1:] else ['--help'])
args ={k: v for k, v in vars(args).items() if v is not None}
# separate out the segmentation `param` args from the top level args
input = args.pop('i')
list_models = args.pop('list_models')
install_model = args.pop('install_model')
install_default_models = args.pop('install_default_models')
...

Member Author:

@kousu how about not "popping" fields from args, and just passing args inside deepseg.core(args), since the first two arguments (fname_input and path_model) are already included in args?

Are you worried that we would be passing a bunch of irrelevant fields inside deepseg.core?

@jcohenadad (Member Author) commented Apr 18, 2020

@kousu (Contributor) commented Apr 18, 2020

An important subtlety: disables pip's progress bar on Travis. This is a small change but it does mean Travis logs are not quite comparable after this point.

@kousu (Contributor) commented Apr 18, 2020

The build is failing at sct_check_dependencies:

Check if ivadomed@git+https://github.com/neuropoly/ivado-medical-imaging@master is installed[FAIL]

No module named 'ivadomed@git+https://github'

This is because

https://github.com/neuropoly/spinalcordtoolbox/blob/72bc861f02e2075b12a2e4fd04d6767e2086f3d8/scripts/sct_check_dependencies.py#L171-L178

does not parse pip VCS URLs.

My best suggestion is to remove sct_check_dependencies entirely and instead rely on set -e + pip install -r requirements.txt. Then even if there somehow is a missing dependency it will fail at runtime but I don't think that's a big loss, since the error will be obvious. Trying to import all the dependencies is all sct_check_dependencies is doing anyway, just up front:

https://github.com/neuropoly/spinalcordtoolbox/blob/72bc861f02e2075b12a2e4fd04d6767e2086f3d8/scripts/sct_check_dependencies.py#L290-L293

My second best suggestion is to edit that code to parse VCS URLs; but that seems tricky because the docs don't even document the pkg@vcs+protocol://... format ivadomed is using here; they say you have to use -e vcs+protocol://....#egg=pkg
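
A sketch of that second option (a hypothetical helper, not code from sct_check_dependencies): reduce the requirement string to its importable name before trying the import, so pip VCS URLs like ivadomed@git+https://... no longer break the check.

import re

def module_from_requirement(requirement):
    # Strip version pins, environment markers and pip VCS suffixes ('pkg@git+https://...'),
    # assuming the distribution name matches the importable module name (true for ivadomed).
    return re.split(r'[@<>=!~;\s\[]', requirement.strip(), maxsplit=1)[0]

# e.g. module_from_requirement('ivadomed@git+https://github.com/neuropoly/ivado-medical-imaging@master')
# returns 'ivadomed', which importlib.import_module() can then try to import.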

@kousu kousu mentioned this pull request Apr 18, 2020
@kousu (Contributor) left a comment:

This is good now. Let it in!

EDIT: ...no it's not. CI is failing now.

@kousu (Contributor) left a comment:

We need to figure out why CI is broken.

Fixes recent crashes in CI due to new ivadomed 1.1 version
@jcohenadad (Member Author):

Yess! CI is now passing. The issue was caused by ivadomed being recently bumped to 1.1, and this change created compatibility issues with the current models (good thing our CI tests are sensitive 🙏). The issue was fixed by updating the URLs, which now point to up-to-date models.
