
NF: implement a non asyncio-based runner #5667

Merged
merged 26 commits on May 21, 2021

Conversation

@christian-monch (Contributor) commented May 17, 2021

PR for a non-asyncio-based runner.

This patch implements a thread-based runner that uses the asyncio protocol interface to communicate
process output to protocol instances.

This change was made to support running datalad commands from processes within a concurrent.futures.ProcessPoolExecutor. That did not work reliably when the datalad command invoked subprocesses and the asyncio-based subprocess machinery was used: in the asyncio-based implementation, processes would randomly wait forever on some synchronization primitive.
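For illustration, here is a minimal sketch of the thread-based approach: reader threads drain the subprocess's stdout/stderr into a queue, and the main thread dispatches the data to an asyncio-protocol-style object. The names (`MinimalProtocol`, `run_threaded`) are hypothetical and only illustrate the idea; they are not the actual API introduced by this PR.

```python
import queue
import subprocess
import threading

class MinimalProtocol:
    """Hypothetical stand-in for an asyncio-style SubprocessProtocol."""
    def pipe_data_received(self, fd, data):
        print(f"fd={fd}: {data!r}")

    def process_exited(self, code):
        print(f"exit code: {code}")

def _reader(stream, fd, out_queue):
    # Drain one pipe in a dedicated thread; signal stream end with None.
    for chunk in iter(lambda: stream.read1(1024), b""):
        out_queue.put((fd, chunk))
    out_queue.put((fd, None))

def run_threaded(cmd, protocol):
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    q = queue.Queue()
    for stream, fd in ((proc.stdout, 1), (proc.stderr, 2)):
        threading.Thread(target=_reader, args=(stream, fd, q), daemon=True).start()
    open_fds = {1, 2}
    # The main thread multiplexes both pipes and feeds the protocol,
    # without ever touching an asyncio event loop.
    while open_fds:
        fd, data = q.get()
        if data is None:
            open_fds.discard(fd)
        else:
            protocol.pipe_data_received(fd, data)
    protocol.process_exited(proc.wait())

run_threaded(["echo", "hello"], MinimalProtocol())
```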

Motivation: I am creating a pipeline/orchestration command for metalad that allows a user to specify an information source element, e.g. a dataset-traverser, and a number of processing elements, e.g. a metadata-extractor and a metadata-adder, each of which is run on the output of the previous element. So the pipeline would look something like this:

[dataset-traverser->metadata-extractor->metadata-adder]

And I would like to use concurrent.futures (or some other process-based parallelization) to distribute pipeline element processing to multiple cores.
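As a minimal sketch of what that distribution could look like with concurrent.futures: the pipeline element names below are hypothetical placeholders, and running datalad commands inside the worker processes is exactly the scenario that required a non-asyncio runner.

```python
from concurrent.futures import ProcessPoolExecutor

# Hypothetical pipeline elements; real metalad elements would invoke datalad
# commands (and thus the runner) inside the worker processes.
def traverse_dataset():
    # Stand-in for a dataset-traverser producing work items.
    yield from ["file-1", "file-2", "file-3"]

def extract_metadata(item):
    # Stand-in for a metadata-extractor running in a worker process.
    return {"item": item, "metadata": f"extracted from {item}"}

def add_metadata(record):
    # Stand-in for a metadata-adder consuming extractor output.
    return f"added: {record['metadata']}"

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=4) as executor:
        extracted = executor.map(extract_metadata, traverse_dataset())
        for result in map(add_metadata, extracted):
            print(result)
```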

This PR is marked as a draft pull request, since it changes a central piece of datalad and I have not yet tested it on Windows or macOS.

Closes #5422, #5409, #5100

TODOs:

This patch implements a thread-based runner that
uses the asyncio protocol interface to communicate
process output to protocol instances
@codecov bot commented May 17, 2021

Codecov Report

Merging #5667 (fe80f1d) into master (8187193) will decrease coverage by 1.87%.
The diff coverage is 98.90%.

❗ Current head fe80f1d differs from pull request most recent head 1424902. Consider uploading reports for the commit 1424902 to get more accurate results
Impacted file tree graph

@@            Coverage Diff             @@
##           master    #5667      +/-   ##
==========================================
- Coverage   90.48%   88.60%   -1.88%     
==========================================
  Files         305      305              
  Lines       41814    41999     +185     
==========================================
- Hits        37834    37212     -622     
- Misses       3980     4787     +807     
Impacted Files Coverage Δ
datalad/cmd_protocols.py 94.44% <94.44%> (ø)
datalad/cmd.py 88.07% <100.00%> (-4.56%) ⬇️
datalad/core/distributed/clone.py 91.95% <100.00%> (+0.08%) ⬆️
datalad/core/distributed/tests/test_clone.py 97.33% <100.00%> (+0.10%) ⬆️
datalad/core/local/status.py 96.42% <100.00%> (+0.03%) ⬆️
datalad/core/local/tests/test_status.py 98.21% <100.00%> (+0.04%) ⬆️
datalad/distribution/tests/test_update.py 98.94% <100.00%> (-1.06%) ⬇️
datalad/distribution/update.py 98.13% <100.00%> (+0.03%) ⬆️
datalad/local/addurls.py 96.39% <100.00%> (-0.67%) ⬇️
datalad/local/subdatasets.py 94.01% <100.00%> (-2.51%) ⬇️
... and 90 more

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 8187193...1424902. Read the comment docs.

@mih (Member) left a comment

Had a first look. Cool!

From my POV this needs some real-world exposure. I will try to use this branch in some of the more tricky tasks, and report back.

THX big time!

There are a number of places where datalad tests examine debug
output. Logging the output of subprocesses throws those tests off.
@yarikoptic (Member)

Woooohooo! I see the light! I am going to try it!!! Thank you @christian-monch!

only 2 failing tests out of >1k in one of the travis runs
======================================================================
FAIL: datalad.dataset.tests.test_gitrepo.test_gitrepo_call_git_methods
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/datalad/tests/utils.py", line 585, in _wrap_with_tree
    return t(*(arg + (d,)), **kw)
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/datalad/dataset/tests/test_gitrepo.py", line 242, in test_gitrepo_call_git_methods
    check("fatal: bad source", cml.out)
AssertionError: 'fatal: bad source' unexpectedly found in "[DEBUG] Async run:\n cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsfoymy59e\n cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']\n[DEBUG] Process 25881 started\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) started\n[DEBUG] ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) started\n[DEBUG] ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) exiting (stream end)\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) got data: b'fatal: bad source, source=notthere, destination=dest\\n'\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) exiting (stream end)\n[DEBUG] Process 25881 exited with return code 128\n"
-------------------- >> begin captured logging << --------------------
datalad.cmd: DEBUG: Async run:
 cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsfoymy59e
 cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']
datalad.cmd: DEBUG: Process 25878 started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=25 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ef9550>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=23 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ef9550>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=23 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ef9550>) exiting (stream end)
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=25 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ef9550>) got data: b'fatal: bad source, source=notthere, destination=dest\n'
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=25 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ef9550>) exiting (stream end)
datalad.cmd: DEBUG: Process 25878 exited with return code 128
datalad.dataset.gitrepo: Level 11: CommandError: 'git -c diff.ignoreSubmodules=none mv -- notthere dest' failed with exitcode 128 under /tmp/datalad_temp_tree_test_gitrepo_call_git_methodsfoymy59e [err: 'fatal: bad source, source=notthere, destination=dest']
datalad.cmd: DEBUG: Async run:
 cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsfoymy59e
 cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']
datalad.cmd: DEBUG: Process 25881 started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) exiting (stream end)
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) got data: b'fatal: bad source, source=notthere, destination=dest\n'
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f52ff5bb0>) exiting (stream end)
datalad.cmd: DEBUG: Process 25881 exited with return code 128
--------------------- >> end captured logging << ---------------------
======================================================================
FAIL: datalad.support.tests.test_gitrepo.test_gitrepo_call_git_methods
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/datalad/tests/utils.py", line 585, in _wrap_with_tree
    return t(*(arg + (d,)), **kw)
  File "/tmp/dl-miniconda-q7xiqj3h/lib/python3.8/site-packages/datalad/support/tests/test_gitrepo.py", line 1632, in test_gitrepo_call_git_methods
    check("fatal: bad source", cml.out)
AssertionError: 'fatal: bad source' unexpectedly found in "[DEBUG] Async run:\n cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsxtvk6i7c\n cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']\n[DEBUG] Process 4319 started\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) started\n[DEBUG] ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) started\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) got data: b'fatal: bad source, source=notthere, destination=dest\\n'\n[DEBUG] ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) exiting (stream end)\n[DEBUG] ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) exiting (stream end)\n[DEBUG] Process 4319 exited with return code 128\n"
-------------------- >> begin captured logging << --------------------
datalad.cmd: DEBUG: Async run:
 cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsxtvk6i7c
 cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']
datalad.cmd: DEBUG: Process 4316 started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=125 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f4291e280>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=123 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f4291e280>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=125 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f4291e280>) got data: b'fatal: bad source, source=notthere, destination=dest\n'
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=125 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f4291e280>) exiting (stream end)
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=123 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f4291e280>) exiting (stream end)
datalad.cmd: DEBUG: Process 4316 exited with return code 128
datalad.dataset.gitrepo: Level 11: CommandError: 'git -c diff.ignoreSubmodules=none mv -- notthere dest' failed with exitcode 128 under /tmp/datalad_temp_tree_test_gitrepo_call_git_methodsxtvk6i7c [err: 'fatal: bad source, source=notthere, destination=dest']
datalad.cmd: DEBUG: Async run:
 cwd=/tmp/datalad_temp_tree_test_gitrepo_call_git_methodsxtvk6i7c
 cmd=['git', '-c', 'diff.ignoreSubmodules=none', 'mv', '--', 'notthere', 'dest']
datalad.cmd: DEBUG: Process 4319 started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) started
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) got data: b'fatal: bad source, source=notthere, destination=dest\n'
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=10 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) exiting (stream end)
datalad.runner: DEBUG: ReaderThread(<_io.FileIO name=14 mode='rb' closefd=True>, <queue.Queue object at 0x7f4f42da0430>) exiting (stream end)
datalad.cmd: DEBUG: Process 4319 exited with return code 128

Left a few tiny individual comments. Would be nice if it were tried within jupyter and also absorbed the test test_popen_invocation demonstrating the issue: https://github.com/datalad/datalad/pull/5369/files#diff-4aa1a9688db72f7a475890e0ee6a343ea98af70ccc3bfa66f17da82c819dfa8dR230

Co-authored-by: Yaroslav Halchenko <debian@onerussian.com>
@christian-monch (Contributor, Author)

only 2 failing tests out of >1k in one of the travis runs

Those should be fixed with commit d64e61d (the reader threads debug-logged the data coming from the subprocess, which confused some tests that capture all output).

@yarikoptic (Member)

FWIW, tried on a Windows VM (conda install jupyter lab) -- although I could not reproduce all the issues others saw, for me it just worked well with this PR. COOL!

@yarikoptic (Member) left a comment

I know it is just a draft PR, but I am too excited ;-) Left a few minor comments.

@yarikoptic (Member)

I have adjusted the description with the issues this PR would close and the TODOs that in my view remain to accomplish (easier to keep track of them when present in the top description) - they could always be addressed or argued against and crossed out ;)

Re the [dataset-traverser->metadata-extractor->metadata-adder] motivation -- I think it aligns very well with the ProducerConsumer pattern we already have (https://github.com/datalad/datalad/blob/master/datalad/support/parallel.py#L71) and, if the bottleneck is IO and not CPU, it could possibly be reused as is, or RFed to:

  • support a sub-process rather than Thread mode of operation
  • be enhanced/subclassed to have some "Reducer" for the final singular metadata-adder, although I think it could just sit on top of ProducerConsumer and consume all produced records to do its "reduce/addition/whatnot" operation in the same process (unless a separate thread or subprocess is indeed desired) -- see the sketch after this list
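A minimal generic sketch of that shape, assuming a producer feeding consumers via a thread pool with a single reducer running in the main process; this intentionally uses only the standard library and does not reproduce datalad's actual ProducerConsumer API.

```python
from concurrent.futures import ThreadPoolExecutor

def traverse():
    # Producer: stand-in for a dataset-traverser.
    yield from ["item-1", "item-2", "item-3"]

def extract(item):
    # Consumer: stand-in for a metadata-extractor, run in worker threads.
    return {"item": item, "metadata": f"meta({item})"}

def add_all(records):
    # Reducer: stand-in for a singular metadata-adder in the main process.
    store = {}
    for record in records:
        store[record["item"]] = record["metadata"]
    return store

with ThreadPoolExecutor(max_workers=4) as pool:
    results = pool.map(extract, traverse())
    print(add_all(results))
```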

@kyleam (Contributor) commented May 17, 2021

This is great to see. Thanks, @christian-monch.

Obviously this PR addressing outstanding problems is more important, but it's worth noting that the benchmarks are looking good too:

benchmarks
2021-05-17T10:23:45.3349457Z -      3.41±0.1ms      2.74±0.03ms     0.80  core.witlessrunner.time_echo
2021-05-17T10:23:45.3350512Z        4.03±0.2ms       3.33±0.3ms    ~0.83  core.witlessrunner.time_echo_gitrunner
2021-05-17T10:23:45.3351661Z -     3.95±0.05ms      3.57±0.06ms     0.90  core.witlessrunner.time_echo_gitrunner_fullcapture
2021-05-17T13:21:00.1262025Z        2.97±0.2ms       2.73±0.2ms     0.92  core.witlessrunner.time_echo
2021-05-17T13:21:00.1263348Z        3.60±0.5ms       3.06±0.4ms    ~0.85  core.witlessrunner.time_echo_gitrunner
2021-05-17T13:21:00.1264621Z        3.81±0.2ms      3.38±0.06ms    ~0.89  core.witlessrunner.time_echo_gitrunner_fullcapture
2021-05-17T13:28:29.8606181Z -      2.40±0.1ms      2.00±0.02ms     0.83  core.witlessrunner.time_echo
2021-05-17T13:28:29.8607182Z -      2.78±0.2ms      2.50±0.06ms     0.90  core.witlessrunner.time_echo_gitrunner
2021-05-17T13:28:29.8609269Z        2.92±0.2ms      2.64±0.06ms    ~0.90  core.witlessrunner.time_echo_gitrunner_fullcapture

https://github.com/datalad/datalad/runs/2599712075
https://github.com/datalad/datalad/runs/2600966706
https://github.com/datalad/datalad/runs/2601062427

christian-monch and others added 2 commits May 17, 2021 21:25
Co-authored-by: Yaroslav Halchenko <debian@onerussian.com>
Co-authored-by: Yaroslav Halchenko <debian@onerussian.com>
@bpoldrack (Member) left a comment

Cool, that's what I wanted to see.
What I'd love to see in addition is a WriterThread as well. ;-)
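For illustration, a minimal sketch of what such a WriterThread could look like: a thread that drains a queue and writes the data to the subprocess's stdin, closing it when it sees a sentinel. The class name and queue protocol here are hypothetical, not part of this PR.

```python
import queue
import subprocess
import threading

class WriterThread(threading.Thread):
    """Hypothetical counterpart to the ReaderThreads: feeds stdin from a queue."""
    def __init__(self, stdin, in_queue):
        super().__init__(daemon=True)
        self.stdin = stdin
        self.in_queue = in_queue

    def run(self):
        while True:
            data = self.in_queue.get()
            if data is None:              # sentinel: no more input
                break
            try:
                self.stdin.write(data)
                self.stdin.flush()
            except BrokenPipeError:       # the subprocess went away
                break
        try:
            self.stdin.close()
        except BrokenPipeError:
            pass

# Usage sketch: stream data into `cat` and read it back.
proc = subprocess.Popen(["cat"], stdin=subprocess.PIPE, stdout=subprocess.PIPE)
q = queue.Queue()
WriterThread(proc.stdin, q).start()
q.put(b"hello from the writer thread\n")
q.put(None)
print(proc.stdout.read().decode(), end="")
proc.wait()
```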

christian-monch and others added 5 commits May 18, 2021 20:47
BrokenPipeError is only raised on writing to a pipe;
we do not have to check for it here.
The done_future-argument is no longer used in
WitlessRunner instances. Deprecate it.
@christian-monch (Contributor, Author) commented May 19, 2021

@yarikoptic: I checked the ability of pycrunch-engine to execute test_ria_basic with this PR in a Python 3.6 environment. Seems fine (#5100 (comment)). If I understand task 3 correctly, you would like a test that uses an async context and executes the runner successfully within that context? I added such a test, so I consider task 3 done ;-)
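For illustration, such a test could look roughly like the sketch below. `run_command` is a hypothetical stand-in for the thread-based runner entry point (the actual function in this PR may differ); the point is only that invoking a purely synchronous runner from inside a running asyncio event loop must not deadlock or require loop nesting.

```python
import asyncio
import subprocess

def run_command(cmd):
    # Stand-in for the thread-based runner: purely synchronous, no asyncio involved.
    result = subprocess.run(cmd, capture_output=True)
    return result.returncode, result.stdout

def test_runner_inside_async_context():
    async def call_runner_from_coroutine():
        # The runner is invoked while an asyncio event loop is running;
        # a loop-based runner could deadlock or refuse to nest here.
        return run_command(["echo", "hello"])

    code, out = asyncio.run(call_runner_from_coroutine())
    assert code == 0
    assert out.strip() == b"hello"

test_runner_inside_async_context()
```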

Co-authored-by: Yaroslav Halchenko <debian@onerussian.com>
@yarikoptic (Member)

Sorry that my suggestion for bd7bb59 was not entirely kosher. It is what broke CI and running tests locally. Let me know if you need a hand to mitigate.

@yarikoptic (Member)

oh no!!! the plague of stalling is upon us? please NO! looking at red appveyor run:

datalad.distribution.tests.test_publish.test_publish_simple ... ok
datalad.distribution.tests.test_publish.test_publish_with_data ... 

@yarikoptic (Member)

FTR: benchmarks are still happy and suggestive of improvement for quickly running commands:

      2.55±0.06ms      2.20±0.07ms    ~0.86  core.witlessrunner.time_echo
       3.08±0.1ms      2.65±0.04ms    ~0.86  core.witlessrunner.time_echo_gitrunner

@christian-monch (Contributor, Author) commented May 20, 2021

@yarikoptic: from my point of view this PR is ready. It seems that the tests that are executed with DATALAD_LOG_LEVEL=2 do not contribute to the coverage database. @kyleam Maybe you know why?

@yarikoptic (Member)

Great job @christian-monch ! This PR is nearly perfectly done, and could only be made a bit more kosher (left 2 comments on that) so we do not need to deal with indigestion later on ;-)

@christian-monch (Contributor, Author)

Great job @christian-monch ! This PR is nearly perfectly done, and could only be made a bit more kosher (left 2 comments on that) so we do not need to deal with indigestion later on ;-)

Both dealt with :-)

@yarikoptic (Member)

FWIW, a note on benchmarks -- short commands get consistently faster, but surprisingly the beast of a studyforrest one is consistently a bit slower:

(tinuous-dev) datalad@smaug:/mnt/datasets/datalad/ci/logs/2021/05$ grep 'usecases.study_forrest.time_make_studyforrest_mockup' */pr/*5667/*/*success/*/8*txt
19/pr/5667/d1888cf/github-Benchmarks-3085-success/vs-master/8_Compare.txt:2021-05-19T20:42:29.0192372Z         45.5±0.1s       48.7±0.04s     1.07  usecases.study_forrest.time_make_studyforrest_mockup
20/pr/5667/1424902/github-Benchmarks-3108-success/vs-master/8_Compare.txt:2021-05-20T19:50:02.2174723Z         40.4±0.1s        43.3±0.2s     1.07  usecases.study_forrest.time_make_studyforrest_mockup
20/pr/5667/144f8f9/github-Benchmarks-3099-success/vs-master/8_Compare.txt:2021-05-20T11:25:36.2577054Z         43.0±0.1s        47.0±0.1s     1.09  usecases.study_forrest.time_make_studyforrest_mockup
20/pr/5667/7f18e9f/github-Benchmarks-3092-success/vs-master/8_Compare.txt:2021-05-20T08:15:15.2595110Z         44.2±0.2s        47.0±0.5s     1.06  usecases.study_forrest.time_make_studyforrest_mockup
20/pr/5667/7f6cc85/github-Benchmarks-3094-success/vs-master/8_Compare.txt:2021-05-20T10:19:06.5116745Z         45.2±0.6s        51.2±0.5s    ~1.13  usecases.study_forrest.time_make_studyforrest_mockup
20/pr/5667/e8e43ff/github-Benchmarks-3100-success/vs-master/8_Compare.txt:2021-05-20T12:59:23.9873648Z        42.9±0.03s       45.9±0.02s     1.07  usecases.study_forrest.time_make_studyforrest_mockup
(tinuous-dev) datalad@smaug:/mnt/datasets/datalad/ci/logs/2021/05$ grep 'core.witlessrunner.time_echo_gitrunner' */pr/*5667/*/*success/*/8*txt
19/pr/5667/d1888cf/github-Benchmarks-3085-success/vs-master/8_Compare.txt:2021-05-19T20:42:29.0181216Z -     3.32±0.08ms       2.80±0.1ms     0.84  core.witlessrunner.time_echo_gitrunner
19/pr/5667/d1888cf/github-Benchmarks-3085-success/vs-master/8_Compare.txt:2021-05-19T20:42:29.0182450Z       3.51±0.03ms      3.27±0.05ms     0.93  core.witlessrunner.time_echo_gitrunner_fullcapture
20/pr/5667/1424902/github-Benchmarks-3108-success/vs-master/8_Compare.txt:2021-05-20T19:50:02.2164221Z       2.79±0.05ms      2.45±0.04ms    ~0.88  core.witlessrunner.time_echo_gitrunner
20/pr/5667/1424902/github-Benchmarks-3108-success/vs-master/8_Compare.txt:2021-05-20T19:50:02.2165249Z        3.06±0.1ms      2.77±0.05ms    ~0.90  core.witlessrunner.time_echo_gitrunner_fullcapture
20/pr/5667/144f8f9/github-Benchmarks-3099-success/vs-master/8_Compare.txt:2021-05-20T11:25:36.2564968Z       2.98±0.05ms       2.62±0.1ms    ~0.88  core.witlessrunner.time_echo_gitrunner
20/pr/5667/144f8f9/github-Benchmarks-3099-success/vs-master/8_Compare.txt:2021-05-20T11:25:36.2566224Z       3.26±0.04ms      3.10±0.09ms     0.95  core.witlessrunner.time_echo_gitrunner_fullcapture
20/pr/5667/7f18e9f/github-Benchmarks-3092-success/vs-master/8_Compare.txt:2021-05-20T08:15:15.2581174Z -     3.09±0.09ms      2.76±0.04ms     0.89  core.witlessrunner.time_echo_gitrunner
20/pr/5667/7f18e9f/github-Benchmarks-3092-success/vs-master/8_Compare.txt:2021-05-20T08:15:15.2582573Z        3.37±0.2ms      3.11±0.09ms     0.92  core.witlessrunner.time_echo_gitrunner_fullcapture
20/pr/5667/7f6cc85/github-Benchmarks-3094-success/vs-master/8_Compare.txt:2021-05-20T10:19:06.5105150Z       3.17±0.06ms      2.94±0.04ms     0.93  core.witlessrunner.time_echo_gitrunner
20/pr/5667/7f6cc85/github-Benchmarks-3094-success/vs-master/8_Compare.txt:2021-05-20T10:19:06.5106280Z       3.31±0.06ms       3.18±0.2ms     0.96  core.witlessrunner.time_echo_gitrunner_fullcapture
20/pr/5667/e8e43ff/github-Benchmarks-3100-success/vs-master/8_Compare.txt:2021-05-20T12:59:23.9861584Z        3.08±0.1ms      2.65±0.04ms    ~0.86  core.witlessrunner.time_echo_gitrunner
20/pr/5667/e8e43ff/github-Benchmarks-3100-success/vs-master/8_Compare.txt:2021-05-20T12:59:23.9862804Z       3.21±0.06ms       2.98±0.2ms     0.93  core.witlessrunner.time_echo_gitrunner_fullcapture

@yarikoptic (Member)

ok, @mih supported my motion, so that totals 2 approvals; let's proceed and fix whatever needs fixing afterwards

Labels
semver-minor Increment the minor version when merged
Development

Successfully merging this pull request may close these issues.

NotImplementedError: nest_asyncio.apply() for datalad.api not sufficient
6 participants