Doc ipyparallel & mpi4py concurrent.futures#7665
Conversation
@jakirkham @jrbourbeau Do you think examples for these cases would be helpful too? As someone relatively new to dask, I was going through the links and noticed that the mpi4py case is clear, but ipyparallel doesn't have an example for the executor.
It's a good question. There is a little bit of documentation for this. Personally I think one example with Loky is probably good enough for us (and is also nice for those familiar with the scikit-learn/joblib collection of libraries). The rest of these serve as other executors one might consider based on their needs. I'm sure there are more that I'm not thinking of that people might want to use for different reasons.
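To illustrate what these libraries have in common: Loky's `get_reusable_executor`, ipyparallel's executor, and `mpi4py.futures` all expose (to my understanding) the same `Executor` interface as the standard library. A minimal sketch of that shared `submit`/`result` interface, using the stdlib `ThreadPoolExecutor` purely as a stand-in:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Any Executor implementation (Loky, ipyparallel, mpi4py.futures, ...)
# exposes this same submit()/result() interface; ThreadPoolExecutor
# here is just the stdlib stand-in.
with ThreadPoolExecutor(max_workers=4) as ex:
    futures = [ex.submit(pow, n, 2) for n in range(5)]
    results = sorted(f.result() for f in as_completed(futures))

print(results)  # [0, 1, 4, 9, 16]
```

Because the interface is shared, any of these executors can in principle be swapped in without changing the surrounding code.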
leofang
left a comment
Thanks, John! LGTM except for one nitpick.
Co-authored-by: Leo Fang <leofang@bnl.gov>
jrbourbeau
left a comment
Thanks @jakirkham! This is in!
Thanks all! 😀
FWIW here's an example using the ipyparallel executor:

```python
from functools import partial

import ipyparallel as ipp

import dask
import dask.array as da

# Connect to a running IPython parallel cluster and grab its executor
c = ipp.Client()
e = c.executor()
submit = e.submit
num_workers = len(e.view)

# Build a dask `get` function backed by the executor's `submit`
get = partial(
    dask.local.get_async,
    submit,
    num_workers,
)

with dask.config.set(scheduler=get):
    a = da.ones((1_000, 1_100), chunks=100)
    r = a.sum()
    print(r.compute())
```

Ideally we could absorb some of the boilerplate around wrapping the executor.

Edit: This should do the trick ( #8112 ). Example usage here ( #8112 (comment) ).
Mention `ipyparallel` and `mpi4py` also support the `concurrent.futures` API and could be used as well.
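As a hedged sketch of what "supporting the `concurrent.futures` API" means in practice: code written against a stdlib executor should also work with `mpi4py.futures.MPIPoolExecutor` or an ipyparallel executor, since all three accept the same `map()` call. The stdlib `ThreadPoolExecutor` below is only a stand-in; the MPI or ipyparallel executors would distribute the same work across ranks or engines.

```python
from concurrent.futures import ThreadPoolExecutor

def square(n):
    # simple stand-in task; with MPI or ipyparallel this would
    # run on a worker rank/engine instead of a local thread
    return n * n

# ThreadPoolExecutor stands in here for mpi4py.futures.MPIPoolExecutor
# or an ipyparallel executor; the map() call is identical for each.
with ThreadPoolExecutor(max_workers=2) as ex:
    results = list(ex.map(square, range(5)))

print(results)  # [0, 1, 4, 9, 16]
```

Swapping in a different executor would only change the construction line, not the `map()` usage.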