FileNotFoundError: [Errno 2] No such file or directory: '_task-rest_bold.json' #516
Comments
To follow up, using |
ha -- situation is a bit different from #362 -- I guess while the
But I guess it is not something you could try out easily, right?
I thought to suggest meanwhile that you could run all individual conversions indeed with |
Hi Yaroslav, thanks for your help.
Indeed, it isn't something I can try easily with my setup. Nevertheless, |
I ran into the same issue as the original poster @m-petersen -- has any solution for this been implemented yet?
@burdinskid13 it looks like this bug is still around -- the best workaround for now seems to be #516 (comment)
Just to blindly counteract the effect which is likely to happen whenever some per-subject process is converting (and loading/saving .json files) while some other top-level populate_aggregated_jsons call goes out to json_load to "harvest" known information. There it should be safe to retry, since the last process to load/save those top-level files will produce the correct one anyway. Hopefully closes nipy#516
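The workaround described above amounts to retrying a JSON load that may transiently fail while another process is rewriting the same file. A minimal sketch of that idea is below; `load_json_with_retry`, its parameters, and the retry/delay values are hypothetical illustrations, not heudiconv's actual API:

```python
import json
import time


def load_json_with_retry(path, retries=5, delay=0.5):
    # Hypothetical helper (not heudiconv's actual function): retry loading
    # a JSON file that a concurrent per-subject process may be rewriting.
    # A half-written file raises JSONDecodeError; a file mid-replacement
    # may briefly raise FileNotFoundError. Both are worth retrying.
    last_err = None
    for _ in range(retries):
        try:
            with open(path) as f:
                return json.load(f)
        except (json.JSONDecodeError, FileNotFoundError) as err:
            last_err = err
            time.sleep(delay)
    raise last_err
```

This is safe in the scenario described because the last writer of the top-level file produces the correct final content, so a reader only needs to outlast transient inconsistency.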
Sorry for the delay. I have now implemented the workaround from that comment as #523. I think it should be safe; a rapid review would be appreciated. If there are no objections, I will merge tomorrow and kick out a fresh heudiconv version -- it has been a while.
🚀 Issue was released in |
Sorry -- I referenced this issue incorrectly within #564, so it was actually released some time before (I guess in 0.10.0).
Summary
I am trying to implement BIDS conversion with heudiconv on our local HPC using a Singularity container (heudiconv 0.9.0). As I will be working with a large cohort (>2000 subjects), I am currently implementing parallelization across nodes via SLURM and within each node via GNU parallel, using a test dataset (the same subject duplicated 5 times). A test run fails with the error below. Interestingly, this affects only 4 of the 5 subjects; the remaining one (seemingly always the first subject to complete) finishes without issues. All of this sounds a lot like the race condition discussed in #362. However, as far as I understand, a fix for that has been implemented, and I am using datalad to create an ephemeral clone of the dataset on a scratch partition for each subject before applying heudiconv to it (scripts below). Maybe I am misunderstanding something, but shouldn't the latter address the race condition, since each process writes to a separate/cloned top-level file?
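The failure mode described above is a classic unsynchronized read-modify-write: several per-subject jobs touch the same top-level .json, and one of them can read the file while another is mid-write. One generic mitigation (an illustrative sketch, not heudiconv's implementation; the function name and arguments are made up for this example) is to serialize access with a POSIX file lock:

```python
import fcntl
import json


def merge_into_aggregated_json(path, subject_entry):
    # Illustrative sketch: merge one subject's metadata into a shared
    # top-level JSON file under an exclusive lock, so a concurrent
    # per-subject job never observes a half-written file.
    with open(path, "a+") as f:
        fcntl.flock(f, fcntl.LOCK_EX)  # blocks until this process owns the file
        try:
            f.seek(0)
            raw = f.read()
            data = json.loads(raw) if raw else {}
            data.update(subject_entry)
            f.seek(0)
            f.truncate()
            json.dump(data, f)
            f.flush()  # make the new content visible before releasing the lock
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
```

Note that per-subject ephemeral clones only isolate subject-level files; if the aggregated top-level files are ultimately merged back to one location, the writes there still need this kind of coordination (or the retry-on-read approach that was eventually adopted).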
Any input would be highly appreciated. Happy to provide further details.
heudiconv_heuristic.txt
pipelines_parallelization.txt (batch script parallelizing pipelines_processing across subjects with GNU parallel)
pipelines_processing.txt
Platform details: