
Parallel reconst workflows #1770

Merged: 6 commits into nipy:master on Mar 29, 2019

Conversation

@sitek (Contributor) commented on Mar 14, 2019

Adding parallelization to peaks_from_model for CSA and CSD in the reconstruction workflows.
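For context, a minimal sketch (not this PR's actual diff) of how parallel peak extraction is enabled at the library level through `peaks_from_model`; the random data, gradient table, and parameter values below are placeholders for illustration only:

```python
import numpy as np
from dipy.core.gradients import gradient_table
from dipy.data import get_sphere
from dipy.direction import peaks_from_model
from dipy.reconst.shm import CsaOdfModel

# Synthetic gradient table and data, standing in for a real acquisition.
rng = np.random.default_rng(0)
bvals = np.concatenate([[0.0], np.full(64, 1000.0)])
bvecs = rng.normal(size=(65, 3))
bvecs[0] = 0
bvecs[1:] /= np.linalg.norm(bvecs[1:], axis=1, keepdims=True)
gtab = gradient_table(bvals, bvecs)
data = rng.random((5, 5, 3, 65))

csa_model = CsaOdfModel(gtab, 6)  # spherical-harmonic order 6

# parallel=True turns on multiprocessing inside peaks_from_model;
# nbr_processes picks the pool size (None means use all available CPUs).
# Run this under `if __name__ == "__main__":` on spawn-based platforms.
csa_peaks = peaks_from_model(csa_model, data, get_sphere('repulsion724'),
                             relative_peak_threshold=0.5,
                             min_separation_angle=25,
                             mask=None,
                             parallel=True,
                             nbr_processes=2)
```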

@pep8speaks commented on Mar 14, 2019

Hello @sitek, thank you for updating!

Cheers! There are no PEP8 issues in this Pull Request. 🍻

Comment last updated at 2019-03-25 21:08:34 UTC
@arokem (Member) commented on Mar 14, 2019

Do we have parallelization implemented in any other workflows? I think we could simplify a bit by using just one input parameter, nbr_processes, which defaults to False. When this is set to a number > 0, that would be the number of processes to spawn; if set to -1, that means use all CPUs. If it's not set, use the default (no parallelization). Would that be consistent with other workflows?
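A hedged sketch of the convention proposed here; `resolve_processes` is a hypothetical helper (not part of DIPY) that maps the single option onto the `parallel`/`nbr_processes` arguments expected by `peaks_from_model`:

```python
import multiprocessing


def resolve_processes(nbr_processes=False):
    """Hypothetical helper: translate the proposed single workflow option
    into the (parallel, nbr_processes) pair used by peaks_from_model.

    False / 0 / None -> no parallelization (the default)
    -1               -> use all available CPUs
    n > 0            -> spawn n processes
    """
    if not nbr_processes:            # default: stay on the serial path
        return False, None
    if nbr_processes == -1:          # -1 means "use every CPU"
        return True, multiprocessing.cpu_count()
    return True, int(nbr_processes)  # explicit process count
```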

@codecov-io commented on Mar 14, 2019

Codecov Report

❗️ No coverage uploaded for pull request base (master@78f74e8).
The diff coverage is n/a.


@@            Coverage Diff            @@
##             master    #1770   +/-   ##
=========================================
  Coverage          ?   83.85%           
=========================================
  Files             ?      120           
  Lines             ?    14572           
  Branches          ?     2295           
=========================================
  Hits              ?    12220           
  Misses            ?     1827           
  Partials          ?      525
Impacted Files Coverage Δ
dipy/workflows/reconst.py 76.89% <ø> (ø)
@skoudoro (Member) left a comment

Agree with you, @arokem. More generally, we need to update peaks_from_model.

Otherwise, looks good to me, thanks @sitek!

Two review comments on dipy/workflows/reconst.py (outdated, resolved).
@skoudoro (Member) commented on Mar 25, 2019

Hi @sitek,

Can you add a test in dipy/workflows/reconst?

After that, this PR is ready to go.

> Do we have parallelization implemented in any other workflows?

No, and I think we should add a global variable for all workflows (like --force). This needs to be implemented in a new PR.

> When this is set to a number > 0, that would be the number of processes to spawn; if set to -1, that means use all CPUs. If it's not set, use the default (no parallelization). Would that be consistent with other workflows?

I agree with this point, but we need a specific PR to check the whole DIPY codebase. Currently we are not consistent, and I think it is a bad idea to update it in this PR. What do you think, @arokem?

@sitek (Contributor, Author) commented on Mar 25, 2019

> Hi @sitek,
>
> Can you add a test in dipy/workflows/reconst?
>
> After that, this PR is ready to go.

Added (and fixed the docs).
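As a rough illustration of what such a workflow-level test could look like (not the test added in this PR): it writes a tiny synthetic dataset to disk and runs the CSA reconstruction flow with parallel peak extraction. The `parallel`/`nbr_processes` keyword names and the default `gfa.nii.gz` output filename are assumptions based on this thread and may differ from the merged code.

```python
# Sketch only: keyword names and output filename below are assumptions.
import os
import tempfile

import numpy as np
import nibabel as nib

from dipy.workflows.reconst import ReconstCSAFlow


def test_reconst_csa_parallel():
    rng = np.random.default_rng(0)
    with tempfile.TemporaryDirectory() as out_dir:
        # Tiny synthetic dataset: 1 b0 + 64 diffusion directions.
        bvals = np.concatenate([[0.0], np.full(64, 1000.0)])
        bvecs = rng.normal(size=(65, 3))
        bvecs[0] = 0
        bvecs[1:] /= np.linalg.norm(bvecs[1:], axis=1, keepdims=True)
        data = rng.random((5, 5, 3, 65)).astype(np.float32)
        mask = np.ones(data.shape[:3], dtype=np.uint8)

        fdata = os.path.join(out_dir, 'data.nii.gz')
        fmask = os.path.join(out_dir, 'mask.nii.gz')
        fbval = os.path.join(out_dir, 'bvals')
        fbvec = os.path.join(out_dir, 'bvecs')
        nib.save(nib.Nifti1Image(data, np.eye(4)), fdata)
        nib.save(nib.Nifti1Image(mask, np.eye(4)), fmask)
        np.savetxt(fbval, bvals)
        np.savetxt(fbvec, bvecs.T)

        # Run the CSA workflow with parallel peak extraction (assumed kwargs).
        flow = ReconstCSAFlow()
        flow.run(fdata, fbval, fbvec, fmask, out_dir=out_dir,
                 parallel=True, nbr_processes=2)

        # Default output filename assumed here for illustration.
        assert os.path.exists(os.path.join(out_dir, 'gfa.nii.gz'))
```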

@arokem (Member) commented on Mar 26, 2019

@skoudoro: we need to make sure that we are consistent in how we define these things, but I agree that this is a matter for a different day...

I think that this PR is ready to go! Will merge in a couple of days, unless anyone objects.

@arokem (Member) commented on Mar 29, 2019

CI failure is unrelated. Merging.

@arokem merged commit 5878720 into nipy:master on Mar 29, 2019

1 of 2 checks passed

continuous-integration/travis-ci/pr: The Travis CI build failed
continuous-integration/appveyor/pr: AppVeyor build succeeded