crawl of openfmri datasets fails since aggregate-metadata crashes with AttributeError #1930
Could you run this with |
[DEBUG ] Dump metadata of <Dataset path=/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001> (merge mode: init) into <Dataset path=/mnt/btrfs/datasets/datalad/crawl>
[DEBUG ] no usable BIDS metadata for CHANGES in <Dataset path=/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001>: File '/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/CHANGES' could not be found in the current BIDS project. [bids_layout.py:get_nearest_helper:33]
[DEBUG ] no usable BIDS metadata for README in <Dataset path=/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001>: File '/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/README' could not be found in the current BIDS project. [bids_layout.py:get_nearest_helper:33]
[DEBUG ] no usable BIDS metadata for participants.tsv in <Dataset path=/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001>: File '/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/participants.tsv' could not be found in the current BIDS project. [bids_layout.py:get_nearest_helper:33]
Traceback (most recent call last):
File "/home/yoh/proj/datalad/datalad-master/venvs/dev/bin/datalad", line 8, in <module>
main()
File "/home/yoh/proj/datalad/datalad-master/datalad/cmdline/main.py", line 347, in main
ret = cmdlineargs.func(cmdlineargs)
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/base.py", line 425, in call_from_parser
ret = cls.__call__(**kwargs)
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/crawl.py", line 130, in __call__
output = run_pipeline(pipeline, stats=stats)
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 114, in run_pipeline
output = list(xrun_pipeline(*args, **kwargs))
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 194, in xrun_pipeline
for idata_out, data_out in enumerate(xrun_pipeline_steps(pipeline, data_in, output=output_sub)):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 286, in xrun_pipeline_steps
for data_out in xrun_pipeline_steps(pipeline_tail, data_, output=output):
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/pipeline.py", line 270, in xrun_pipeline_steps
for data_ in data_in_to_loop:
File "/home/yoh/proj/datalad/datalad-master/datalad/crawler/nodes/annex.py", line 1329, in _finalize
aggregate_metadata(dataset='^', path=self.repo.path)
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/utils.py", line 437, in eval_func
return return_func(generator_func)(*args, **kwargs)
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/utils.py", line 425, in return_func
results = list(results)
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/utils.py", line 382, in generator_func
result_renderer, result_xfm, _result_filter, **_kwargs):
File "/home/yoh/proj/datalad/datalad-master/datalad/interface/utils.py", line 449, in _process_results
for res in results:
File "/home/yoh/proj/datalad/datalad-master/datalad/metadata/aggregate.py", line 667, in __call__
to_save)
File "/home/yoh/proj/datalad/datalad-master/datalad/metadata/aggregate.py", line 238, in _extract_metadata
store(meta, objpath)
File "/home/yoh/proj/datalad/datalad-master/datalad/support/json_py.py", line 64, in dump2xzstream
with lzma.LZMAFile(fname, mode='w') as f:
AttributeError: __exit__
()
> /home/yoh/proj/datalad/datalad-master/datalad/support/json_py.py(64)dump2xzstream()
-> with lzma.LZMAFile(fname, mode='w') as f:
(Pdb) p fname
'/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/.datalad/metadata/objects/11/cn-ceee6877e54bbbe61688bcfa2dadac'
(Pdb)
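The `AttributeError: __exit__` points at the `with lzma.LZMAFile(fname, mode='w') as f:` line: on Python 2 the third-party pyliblzma backport provides an `LZMAFile` without `__enter__`/`__exit__`, so it cannot be used as a context manager (the stdlib `lzma` in Python 3 supports it fine). A minimal sketch of the failure mode and the usual workaround, using a hypothetical stand-in class rather than the real `LZMAFile`:

```python
from contextlib import closing

class LegacyLZMAFile:
    """Hypothetical stand-in for a file-like object that, like the
    Python 2 pyliblzma LZMAFile, lacks __enter__/__exit__."""
    def __init__(self):
        self.closed = False
        self.data = []

    def write(self, chunk):
        self.data.append(chunk)

    def close(self):
        self.closed = True

# `with LegacyLZMAFile() as f:` would fail, because the object does not
# implement the context-manager protocol -- on Python 2 that surfaces
# exactly as AttributeError: __exit__.

# contextlib.closing supplies the missing protocol and guarantees that
# close() is called when the block exits:
f = LegacyLZMAFile()
with closing(f) as fh:
    fh.write(b"metadata")

print(f.closed)  # True: close() was invoked by closing()
```

So a likely fix in `dump2xzstream` would be to wrap the `LZMAFile` in `contextlib.closing()` (or gate on the Python/lzma version), rather than relying on the `with` protocol being present.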
[1] + 13262 suspended datalad --dbg -l debug crawl
(dev)3 10569 ->148 [1].....................................:Sun 29 Oct 2017 09:41:08 AM EDT:.
(git)smaug:/mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001[master]git
$> ls -l /mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/.datalad/metadata/objects/11/cn-ceee6877e54bbbe61688bcfa2dadac
-rw------- 1 yoh datalad 32 Oct 29 09:40 /mnt/btrfs/datasets/datalad/crawl/openfmri/ds000001/.datalad/metadata/objects/11/cn-ceee6877e54bbbe61688bcfa2dadac
FWIW, you could trigger it by running
|
I ran this (twice) successfully (using current master) and could not replicate the problem.
|