
Conversation

@CerenB (Collaborator) commented Jan 7, 2021

The parallel processing seems to be working: the processing time is cut short.

group is inside the parfor loop. Is that ok?
A tiny issue is with the copy-figure function; see below and also issue #178.

Warning: No figure file to copy
> In copyGraphWindownOutput (line 50)
  In copyFigures (line 39)
  In parallel_function>make_general_channel/channel_general (line 932)
  In remoteParallelFunction (line 41)

Remi-Gau and others added 29 commits November 1, 2020 07:06
…add-marcobarilari

docs: add marcobarilari as a contributor
…add-marcobarilari

docs: add marcobarilari as a contributor
…add-fedefalag

docs: add fedefalag as a contributor
…add-CerenB

docs: add CerenB as a contributor
…add-CerenB

docs: add CerenB as a contributor
…add-Remi-Gau

docs: add Remi-Gau as a contributor
codecov bot commented Jan 7, 2021

Codecov Report

Merging #257 (9557217) into dev (b01afdc) will decrease coverage by 0.06%.
The diff coverage is 26.66%.


@@            Coverage Diff             @@
##              dev     #257      +/-   ##
==========================================
- Coverage   53.71%   53.65%   -0.07%     
==========================================
  Files         111      111              
  Lines        1936     1944       +8     
==========================================
+ Hits         1040     1043       +3     
- Misses        896      901       +5     
Impacted Files                      Coverage          Δ
src/workflows/bidsSpatialPrepro.m   0.00% <0.00%>     (ø)
src/utils/manageWorkersPool.m       19.04% <57.14%>   (+12.79%) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update b01afdc...9557217.

@Remi-Gau (Contributor) left a comment

LGTM.

One suggestion we could add in here to make that workflow even more readable.

LMWYT

@Remi-Gau (Contributor) commented Jan 7, 2021

Cool. Thanks for that @CerenB

The parallel processing seems to be working: the processing time is cut short.

group is inside the parfor loop. Is that ok?
A tiny issue is with the copy-figure function; see below and also issue #178.

Warning: No figure file to copy
> In copyGraphWindownOutput (line 50)
  In copyFigures (line 39)
  In parallel_function>make_general_channel/channel_general (line 932)
  In remoteParallelFunction (line 41)

For the figure saving I think I will open a separate issue and address it in another PR.

I think I am learning that with PRs, "size matters: small is beautiful".

@CerenB (Collaborator, Author) commented Jan 7, 2021

Another thing: I was trying the lines below in my analysis pipeline (in the getOption.m file):

opt.parallelize.do = false;
opt.parallelize.nbWorkers = 4;
opt.parallelize.killOnExit = false;

But it still goes into parallel processing. Do you have any guess what I might be doing wrong? I thought that opt.parallelize.do = false would make the parfor loop behave like a for loop.

@Remi-Gau (Contributor) commented Jan 7, 2021

Another thing: I was trying the lines below in my analysis pipeline (in the getOption.m file):

opt.parallelize.do = false;
opt.parallelize.nbWorkers = 4;
opt.parallelize.killOnExit = false;

But it still goes into parallel processing. Do you have any guess what I might be doing wrong? I thought that opt.parallelize.do = false would make the parfor loop behave like a for loop.

Ha yes, that's the part that might need some documentation.

If set to false, it should only use one core to run the "parallel" loops, which is effectively NOT parallel.

This should be set here:

https://github.com/cpp-lln-lab/CPP_SPM/blob/b01afdcb081d23d524f030a02cba82300d9d51b9/src/utils/manageWorkersPool.m#L23

But seeing this now, I get the feeling that I have not coded this properly. Will open an issue to fix that in a separate PR.
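
For reference, here is a minimal sketch of what the intended guard could look like (hypothetical logic, assuming the Parallel Computing Toolbox; the actual fix in manageWorkersPool.m may differ):

% force a single worker when parallelization is disabled, so the
% parfor loops effectively run one iteration at a time
if ~opt.parallelize.do
    opt.parallelize.nbWorkers = 1;
end

% only open a parallel pool when more than one worker is requested
% and no pool is running yet
if opt.parallelize.nbWorkers > 1 && isempty(gcp('nocreate'))
    parpool(opt.parallelize.nbWorkers);
end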

@CerenB (Collaborator, Author) commented Jan 7, 2021

How can I tell whether things are working properly? Any suggestions?

I might be over-reading it, but when I re-run things with opt.parallelize.do = false, a warning message pops up saying a .MAT file is deleted. Please see below.


Segment /Users/battal/Cerens_files/fMRI/Processed/RhythmCateg/Pilots/PitchFT/derivatives/cpp_spm/sub-009/ses-001/anat/sub-009_ses-001_T1w.nii
  In spm_create_vol (line 17)
  In spm_write_vol (line 83)
  In spm_uw_apply (line 331)
  In spm_run_realignunwarp (line 98)
  In cfg_run_cm (line 29)
  In cfg_util>local_runcj (line 1717)
  In cfg_util (line 972)
  In spm_jobman>fill_run_job (line 469)
  In spm_jobman (line 247)
  In saveAndRunWorkflow (line 29)
  In parallel_function>make_general_channel/channel_general (line 932)
  In remoteParallelFunction (line 41)
Warning: Forcing deletion of MAT-file.
> In spm_create_vol>create_vol (line 204)
  In spm_create_vol (line 17)
  In spm_write_vol (line 83)
  In spm_uw_apply (line 331)
  In spm_run_realignunwarp (line 98)
  In cfg_run_cm (line 29)
  In cfg_util>local_runcj (line 1717)
  In cfg_util (line 972)
  In spm_jobman>fill_run_job (line 469)
  In spm_jobman (line 247)
  In saveAndRunWorkflow (line 29)
  In parallel_function>make_general_channel/channel_general (line 932)

@CerenB (Collaborator, Author) commented Jan 7, 2021

I've tested by re-running again without the parallel processing, and it seems that re-running the spatial preprocessing deletes some .MAT files, so all good here. Parallel processing seems to do no harm to our preprocessing 👍

@CerenB (Collaborator, Author) commented Jan 7, 2021

I've tested by re-running again without the parallel processing, and it seems that re-running the spatial preprocessing deletes some .MAT files, so all good here. Parallel processing seems to do no harm to our preprocessing 👍

One tiny question along these lines: would parallel processing be the same as opening 4 MATLAB sessions and running different subjects? Though I am not sure what would happen if I changed one script (to change the subject number) and saved it while it was already running with another subject number. Any comments on that?

Co-authored-by: Remi Gau <remi_gau@hotmail.com>
CerenB marked this pull request as ready for review January 7, 2021 10:42
@Remi-Gau (Contributor) commented Jan 7, 2021

One tiny question along these lines: would parallel processing be the same as opening 4 MATLAB sessions and running different subjects?

Yup, though I suspect it may use resources a bit differently (I don't know how the overhead of memory management differs between having 4 MATLAB sessions and a single session running a parfor loop on 4 cores).

Though I am not sure what would happen if I changed one script (to change the subject number) and saved it while it was already running with another subject number. Any comments on that?

I am pretty sure it won't affect anything that is already launched. But once a launched job stops, the code will reflect any changes you applied in the meantime.

It can be VERY confusing.
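
To make the analogy concrete, here is a minimal sketch of a parfor loop over subjects (hypothetical subject list and loop body; not the actual workflow code):

subjects = {'sub-001', 'sub-002', 'sub-003', 'sub-004'};

parfor iSub = 1:numel(subjects)
    % each iteration runs on its own worker with its own workspace,
    % much like a separate MATLAB session; a script you edit and save
    % after launch does not affect iterations that are already running
    fprintf('preprocessing %s\n', subjects{iSub});
end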

@Remi-Gau (Contributor) commented Jan 7, 2021

Another thing: I was trying the lines below in my analysis pipeline (in the getOption.m file):

opt.parallelize.do = false;
opt.parallelize.nbWorkers = 4;
opt.parallelize.killOnExit = false;

But it still goes into parallel processing. Do you have any guess what I might be doing wrong? I thought that opt.parallelize.do = false would make the parfor loop behave like a for loop.

hey @CerenB

I sent a PR to your branch to fix this. You can review it on your repo, and when we merge it, it will update this PR automatically.

CerenB#1

Remi-Gau and others added 2 commits January 7, 2021 12:12
make sure that when set to false parallel.do only uses 1 worker
@Remi-Gau (Contributor) commented Jan 7, 2021

If there isn't anything else to add here, I think we can merge.

@Remi-Gau (Contributor) commented Jan 7, 2021

If there isn't anything else to add here, I think we can merge.

meaning: feel free to press the big green button.

@CerenB (Collaborator, Author) commented Jan 7, 2021

I thought so too, but I do not have write access to accept the PR, so no green button.

@Remi-Gau (Contributor) commented Jan 7, 2021

I gave you "write" access to the repo. This should do it. :-)

CerenB merged commit 448182a into cpp-lln-lab:dev Jan 7, 2021
@Remi-Gau (Contributor) commented Jan 7, 2021

@all-contributors please add @CerenB for code, review

@allcontributors (Contributor) commented

@Remi-Gau

I've put up a pull request to add @CerenB! 🎉

@Remi-Gau (Contributor) commented Jan 7, 2021

@all-contributors please add @Remi-Gau for test

@allcontributors (Contributor) commented

@Remi-Gau

I've put up a pull request to add @Remi-Gau! 🎉

CerenB deleted the cer-spatial_parfor branch June 8, 2021 14:33

Development

Successfully merging this pull request may close these issues:

Parallel workflow still operates on several workers even when do is set to false
