A sample nipype interface for fit_tensor #332

Closed
wants to merge 3 commits into master

5 participants
@MrBago
Contributor

MrBago commented Feb 26, 2014

This is what an interface for dipy_fit_tensor might look like. It even works with Chris's nipype_cmd script!

Please let me know if I made any mistakes implementing the nipype interface, but hopefully there is enough here to get some discussion going.

I think I'll need to refactor the fit_tensor function again. I'd like to add some IO abstraction so that images can be passed in either as file names or as some sort of image objects; does nipype already have anything like this? I'm also thinking about how we would best drop the dipy_fit_tensor script and replace it with nipype_cmd. One way would be for dipy to add nipype as a dependency. Alternatively, if our scripts had inputs and run attributes, that might almost be enough to make nipype_cmd work.

Any thoughts @chrisfilo @Garyfallidis @oesteban?

fisher ~/.dipy/stanford_hardi$ nipype_cmd dipy.workflows.interfaces DipyFitTensor HARDI150.bvec HARDI150.nii.gz
setting function inputs

bvec = HARDI150.bvec
dwi_data = HARDI150.nii.gz
ignore_exception = False
mask = <undefined>
min_signal = <undefined>
nomask = <undefined>
root = <undefined>
scale = <undefined>


ad_map = HARDI150_ad.nii.gz
dirfa_map = HARDI150_dirFA.nii.gz
dummy_trk_file = HARDI150_dummy.trk
fa_map = HARDI150_fa.nii.gz
mask_file = HARDI150_mask.nii.gz
md_map = HARDI150_md.nii.gz
rd_map = HARDI150_rd.nii.gz
t2di = HARDI150_t2di.nii.gz
tensor = <undefined>
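The IO abstraction mentioned above could be as small as a duck-typed loader that accepts either a file name or an already-loaded image. This is a hypothetical sketch, not dipy or nipype API; the name `as_image` and the injected `loader` argument are illustrative assumptions:

```python
def as_image(img_or_fname, loader):
    """Return an image object, loading from disk only when given a path.

    `loader` is whatever callable turns a file name into an image
    (e.g. nibabel.load); injecting it keeps this helper IO-agnostic.
    """
    if isinstance(img_or_fname, str):
        return loader(img_or_fname)
    # Anything that is not a string is assumed to already quack like an image.
    return img_or_fname
```

A workflow function could then call `as_image` on each of its image arguments and stop caring which form the caller used.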
@arokem


Member

arokem commented Feb 26, 2014

I missed the discussion on this, so you are probably ahead of me on this,
but I will still go ahead and ask: does this belong in dipy? Shouldn't all
this code be in nipype? I remember that's what we did when we built an
interface for nitime.


@MrBago


Contributor

MrBago commented Feb 26, 2014

Yep, the interface stuff will live in nipype; I did this mostly to help us work out some of the design pieces. Once we're ready to go live with it, I'll break it up into two pull requests.

@arokem


Member

arokem commented Feb 26, 2014

Cool. I can see how the 'workflows' module is going to be a very useful
addition. This will be a good place to add the patterns we have been using
in preparing the examples: a workflow for finding the corpus callosum, for
calculating noise levels, for tracking, etc.


     help="use BET by default, --mask=none to not use mask")
parser.add_option("-m", "--mask", default="median_otsu",
                  help="auto-mask by default, --mask=none to not use mask, "
                       "--mask=file.nii.gz for custom mask")
parser.add_option("--threshold", help="threshold passed to BET", default='.2')


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

You have removed BET; please update the help text accordingly.

from dipy.io.bvectxt import read_bvec_file, orientation_to_string
from nibabel.trackvis import empty_header, write
from dipy.workflows.fit_tensor import fit_tensor
usage = """fit_tensor [options] dwi_images"""
parser = OptionParser(usage)
parser.add_option("-b","--bvec",help="text file with gradient directions")


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

What I don't like here is that we don't support having bvalues and bvectors in different files. This is counterintuitive for someone who has played with the dipy and FSL tutorials. We need to support these files too. It should be trivial; we have all the helper functions already implemented for this. Actually, I think you wrote many of these @MrBago.

def saveqform(img, filename):
    """Some DTI/tools require the qform code to be 1. We set the affine, qform,
    and sfrom to be the same for maximum portibility.


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

typo: portability

def saveqform(img, filename):
    """Some DTI/tools require the qform code to be 1. We set the affine, qform,


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

Which DTI tools require this? Why doesn't nibabel support that by default?

else:
    root = root
if bvec is None:


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

Okay, here we definitely need to support the case when bvalues and bvectors are in different files, which is what most people already have.
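Supporting separate bvals/bvecs files is indeed mostly plumbing; dipy's read_bvals_bvecs helper already covers it. A numpy-only sketch of the FSL-style convention (one row of b-values, three rows of b-vectors), with `load_bvals_bvecs` as a placeholder name:

```python
import numpy as np

def load_bvals_bvecs(bvals_path, bvecs_path):
    """Minimal sketch of reading FSL-style bvals/bvecs text files;
    dipy.io.read_bvals_bvecs does this for real.
    bvals: one row of N values; bvecs: 3 rows of N values (or N rows of 3)."""
    bvals = np.loadtxt(bvals_path).ravel()
    bvecs = np.loadtxt(bvecs_path)
    # Normalize the 3 x N layout to (N, 3); an N == 3 file stays ambiguous.
    if bvecs.shape[0] == 3 and bvecs.shape[1] != 3:
        bvecs = bvecs.T
    assert bvecs.shape == (bvals.size, 3)
    return bvals, bvecs
```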

saveqform(nib.Nifti1Image(ten.md.astype("float32"), affine), out['md_map'])
saveqform(nib.Nifti1Image(ten.fa.astype("float32"), affine), out['fa_map'])
dfa = np.abs(ten.fa[..., None] * ten.evecs[..., 0])


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

We now have a function for color FA. I would recommend using that here.
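For context, dipy's dipy.reconst.dti.color_fa computes exactly this kind of map. A numpy sketch of the idea (FA-weighted absolute principal eigenvector), with `color_fa_sketch` as a placeholder name:

```python
import numpy as np

def color_fa_sketch(fa, evecs):
    """RGB map weighting the principal eigenvector by FA, equivalent in
    spirit to dipy.reconst.dti.color_fa.
    fa: array of shape (...,); evecs: array of shape (..., 3, 3)."""
    fa = np.clip(fa, 0, 1)          # guard against fit noise outside [0, 1]
    return np.abs(evecs[..., 0]) * fa[..., None]
```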

dfa.shape = dfa.shape[:-1]
saveqform(nib.Nifti1Image(dfa, affine), out['dirfa_map'])
trk_hdr = empty_header()


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

Why is the trackvis header useful here? Isn't this file only for tensor creation?

trk_hdr['vox_to_ras'] = affine
# One streamline with two points at [0, 0, 0]
dummy_track = [(np.zeros((2,3), dtype='float32'), None, None)]
write(out['dummy_trk_file'], dummy_track, trk_hdr)


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

Yeah, this looks irrelevant. Please remove.

dummy_track = [(np.zeros((2,3), dtype='float32'), None, None)]
write(out['dummy_trk_file'], dummy_track, trk_hdr)
return out


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

@mdesco, @jchoude can you share your tensor fit script from scilpy with @MrBago so he can make a hybrid and more advanced version of the two?


@samuelstjean

samuelstjean Mar 17, 2014

Contributor

This is what we have upstream for the dti script. We are halfway through rebuilding them to be more uniform, but that gives some groundwork to start on at least.

http://pastebin.com/LAdKQpM4

from nipype.interfaces.base import (BaseInterfaceInputSpec, BaseInterface,
                                    isdefined, TraitedSpec)
from dipy.workflows.fit_tensor import ft_out as fit_tensor_outputs


@Garyfallidis

Garyfallidis Mar 16, 2014

Member

Cool, this file will obviously go into Nipype as we discussed in the hangout. I like it @MrBago !!! @chrisfilo what do you think?


@chrisfilo

chrisfilo Mar 18, 2014

Member

Looks good to me!

@Garyfallidis


Member

Garyfallidis commented Mar 16, 2014

Okay, in general I very much like Bago's suggestion. I like the idea of making a workflows module where we keep the main body of a recommended pipeline, like fit_tensor or fit_csd etc., and then have a small wrapper for that function in dipy/bin as a command line and an interface in nipype. So, this is definitely a +1 from me.

However, what we should definitely do is ensure that these workflows always work correctly. For this purpose we need to invent ways to check the quality of the results offline. For example, how could we know that dipy_fit_tensor is working correctly while it runs? If we figure this out, then Dipy will be in a very strong position, as it will have 3 different levels of sanity checks:

a) test individual functions with nose (done)
b) run the tutorials (done but not automated)
c) check quality for the recommended command lines e.g. dipy_fit_tensor! (needs thinking)

For c) we should have it working with relatively large datasets, maybe once a day or so. One Linux computer doing that job every night would probably be enough for a start. I know that @matthew-brett runs the nipy examples (I think) once a day on the buildbots. We could do something similar, but first we need ideas for how to check that the commands work correctly.
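Point c) could start as simple plausibility assertions on the derived maps. A hypothetical sketch; the function name, the checks, and the idea of returning a problem list are illustrative assumptions, not established dipy values:

```python
import numpy as np

def check_tensor_outputs(fa, md):
    """Offline sanity check for a fit_tensor run: derived maps must fall
    in physically plausible ranges. Returns a list of problem strings,
    empty when everything looks sane."""
    problems = []
    if not np.all(np.isfinite(fa)):
        problems.append("FA contains NaN/inf")
    elif fa.min() < 0 or fa.max() > 1:
        problems.append("FA outside [0, 1]")
    if md.min() < 0:
        problems.append("negative mean diffusivity")
    return problems
```

A nightly job could run the recommended command lines on a known dataset and fail loudly whenever this kind of check returns anything.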

@samuelstjean


Contributor

samuelstjean commented Mar 17, 2014

For c) I think Alexandre Gauvin is working on something similar, a quality assurance tool for pipelines; you should go talk to him.

@MrBago


Contributor

MrBago commented Mar 17, 2014

Hi guys, there are some parts of this PR I would like some help on.

In dipy we have the fit_tensor function and the fit_tensor output dictionary. We also have a script that uses argparse to give command-line access to this function.

In nipype we have the FitTensor interface and the FitTensor input spec; the nipype output spec is mostly auto-generated from the dipy output dictionary.

Some possible improvements, in no specific order:

  1. It's a little odd to have the input spec in nipype and the output spec in dipy, but for a first pass this was the easiest way for me to build this. Is there a better way for us to organize this? The main obstacle is that the input spec seems to need traits, so I had to move it from dipy to nipype.

  2. The dipy_fit_tensor script and Chris's nipype_cmd script essentially accomplish the same thing. Is there some way to leverage this so that A) there's less work for developers creating these workflows/command-line tools and B) users coming from the nipype side and the dipy side have similar experiences? B will help us optimize the experience of both sets of users with half as much work.

  3. Could we structure the IO in the fit_tensor function better? The goal here would be to better separate the more general IO logic from the logic specific to this script. Maybe we could have a CommandLineTools class that handled some of the base logic, which other command-line tools would inherit from.

We don't need answers to all these questions before we move forward, we can always add improvements as we think of them, but I just wanted to share some of the questions that were bouncing around in my head in case you guys had suggestions.
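The CommandLineTools idea from point 3 might look roughly like this: a base class owning the generic argparse plumbing, with each workflow supplying only inputs and a run method (the attributes nipype_cmd would need). All names here are illustrative assumptions, not dipy or nipype API:

```python
import argparse

class CommandLineTool:
    inputs = {}            # name -> help text, filled in by subclasses

    def run(self, **kwargs):
        raise NotImplementedError

    def main(self, argv=None):
        # Generic plumbing: build a parser from the declared inputs,
        # parse, and hand the values to the subclass's run method.
        parser = argparse.ArgumentParser(description=self.__doc__)
        for name, help_text in self.inputs.items():
            parser.add_argument(name, help=help_text)
        args = parser.parse_args(argv)
        return self.run(**vars(args))

class FitTensorTool(CommandLineTool):
    """Fit a diffusion tensor (toy stand-in for dipy_fit_tensor)."""
    inputs = {"dwi_data": "4D diffusion-weighted image",
              "bvec": "text file with gradient directions"}

    def run(self, dwi_data, bvec):
        # Real logic would fit the tensor; here we just derive output names.
        return {"fa_map": dwi_data.replace(".nii.gz", "_fa.nii.gz")}
```

With inputs and run as the contract, both a dipy/bin script and a nipype wrapper could drive the same object.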

@chrisfilo


Member

chrisfilo commented Mar 18, 2014

Yes, the redundancy between nipype_cmd and dipy_fit_tensor is a bit problematic. It really boils down to dependencies. If you would consider keeping nipype interfaces within the dipy package and making nipype an optional dependency for those who want to use the command-line tools, it would make code management much easier. I'm sure we can put substantial effort into slimming down nipype's import cost to ease this burden. Maybe @satra can weigh in with some of his experience.
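The optional-dependency route usually comes down to a lazy import that fails loudly only when the interface is actually used. A sketch, with `require_nipype` as a placeholder name:

```python
def require_nipype():
    """Import nipype on demand, raising a clear error when it is missing.

    Keeping the import inside this function means plain dipy users never
    pay nipype's import cost, and never need it installed at all.
    """
    try:
        import nipype
    except ImportError:
        raise ImportError("The dipy command-line interfaces require nipype; "
                          "install it with 'pip install nipype'.")
    return nipype
```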

    "ad_map" : "%s_ad.nii.gz",
    "rd_map" : "%s_rd.nii.gz",
    "dummy_trk_file" : "%s_dummy.trk",
    "mask_file" : "%s_mask.nii.gz",


@Garyfallidis

Garyfallidis Apr 11, 2014

Member

I think the creation of the mask should not happen in the fit_tensor function. Mask creation should always come before fit_tensor, as is done in FSL and MRtrix etc. I mean that the mask should always be provided, even if it is ones everywhere.


@MrBago

MrBago Apr 11, 2014

Contributor

Hi Eleftherios, just to put this in context: this is not an actual pull request; it is just meant to demonstrate how nipype and dipy might work together. I chose this function to work with because I thought it would serve as a good example. I wrote this script a long time ago, primarily for use here at UCSF, and included it in dipy when I got feedback that others thought it was useful. I'm happy to spend some more time making it more useful, but the reason I created this PR and pinged Chris on it is to follow up on our earlier discussions about nipype/dipy interoperability. I really would like to separate out the two discussions; debating the merits of this script and the interoperability issues in one place makes the conversation very hard to follow.


@Garyfallidis

Garyfallidis Apr 11, 2014

Member

Yes, I know. Sorry for bringing this up here; I just seized the opportunity. I should have opened an issue.

@Garyfallidis


Member

Garyfallidis commented Apr 11, 2014

Okay, about the interface design. I think that what you already propose is the way to go. We will start adding functions in dipy.workflows which can be called by minimal scripts in dipy/bin, and the nipype interfaces will be added to the nipype codebase by us or the nipype developers. That way we share the workload between the dipy and nipype developers. It will also help communication and sharing between the projects. I am worried that if we keep the nipype interfaces inside dipy, they will not be as good as if they were written or supervised by the nipype developers. Also, we already struggle with many optional and non-optional dependencies. What do you think @MrBago and @chrisfilo?

@arokem


Member

arokem commented Jun 24, 2015

OK - this one has languished here for a while, but since this is something that came up in conversations at HBM, it seems like a good opportunity to bring it back up. In those conversations, I became convinced that creating nipype interfaces would be a big win, for many different reasons. However, I think that these interfaces should be minimal and rely mostly on a scripting API implemented within dipy.

First of all, I have a much slimmer proposal for a nipype DTI fitting interface here:

nipy/nipype#1121

The main idea is that no matter what derived measure you want to calculate from DTI, you will always need the tensor parameters. So start by calculating and caching those, and generate pipelines in which further calculations (e.g. FA and MD) depend on this interface. This is very different from the "calculate everything" approach proposed here.

Tests are not passing, but the failures are hard for me to decode. I think that I might be doing the testing wrong somehow. Still examining that.

I think that the way forward is to create a workflows module in dipy, and to create many useful and flexible scripting interfaces there (not enforcing any particular uniform API, and without OO). That will make interfacing with nipype much easier, and will also make it simpler to write CLIs based on argparse, without having to duplicate functionality in different places.
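The proposed split might look like this: a plain workflow function in dipy.workflows plus a thin argparse shim in dipy/bin. The fit_tensor signature and outputs here are simplified assumptions for illustration, not the real script:

```python
import argparse

def fit_tensor(dwi_data, bvec, mask=None):
    """Plain-function workflow (would live in dipy.workflows).
    Here it only derives output file names; the real one fits the tensor."""
    root = dwi_data.replace(".nii.gz", "")
    return {"fa_map": root + "_fa.nii.gz", "md_map": root + "_md.nii.gz"}

def main(argv=None):
    """Thin CLI shim (would live in dipy/bin): parse args, call the function."""
    parser = argparse.ArgumentParser(prog="dipy_fit_tensor")
    parser.add_argument("dwi_data")
    parser.add_argument("bvec")
    parser.add_argument("--mask", default=None)
    args = parser.parse_args(argv)
    return fit_tensor(**vars(args))
```

A nipype interface would wrap the same `fit_tensor` function, so the logic lives in exactly one place.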

@Garyfallidis


Member

Garyfallidis commented Dec 30, 2015

Agree with @arokem . Should we close this one?

@Garyfallidis


Member

Garyfallidis commented Sep 8, 2016

This has been inactive for a long time. Closing ...
