Update inlining Futures in task graph in Client._graph_to_futures #3303

Merged: 5 commits, Dec 10, 2019
7 changes: 5 additions & 2 deletions distributed/client.py
@@ -51,6 +51,7 @@
    WrappedKey,
    unpack_remotedata,
    pack_data,
+   subs_multiple,
    scatter_to_workers,
    gather_from_workers,
    retry_operation,
@@ -2435,10 +2436,12 @@ def _graph_to_futures(
        futures = {key: Future(key, self, inform=False) for key in keyset}

        values = {
-           k for k, v in dsk.items() if isinstance(v, Future) and k not in keyset
+           k: v
+           for k, v in dsk.items()
+           if isinstance(v, Future) and k not in keyset
        }
        if values:
-           dsk = dask.optimization.inline(dsk, keys=values)
+           dsk = subs_multiple(dsk, values)

        d = {k: unpack_remotedata(v, byte_keys=True) for k, v in dsk.items()}
        extra_futures = set.union(*[v[1] for v in d.values()]) if d else set()
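A rough sketch of what the change above does, under two stated assumptions: `FakeFuture` stands in for distributed's `Future` (which needs a live client), and `subs_multiple` is copied inline from the helper this PR adds so the snippet runs standalone.

```python
# Sketch of the substitution step in Client._graph_to_futures.
class FakeFuture:
    """Stand-in for distributed.client.Future, which requires a client."""
    def __init__(self, key):
        self.key = key

def subs_multiple(o, d):
    # Inline copy of the helper added in this PR.
    typ = type(o)
    if typ is tuple and o and callable(o[0]):  # istask(o)
        return (o[0],) + tuple(subs_multiple(i, d) for i in o[1:])
    elif typ is list:
        return [subs_multiple(i, d) for i in o]
    elif typ is dict:
        return {k: subs_multiple(v, d) for (k, v) in o.items()}
    else:
        try:
            return d.get(o, o)
        except TypeError:  # unhashable leaf, e.g. a set
            return o

def inc(x):
    return x + 1

fut = FakeFuture("x")
dsk = {"x": fut, "y": (inc, "x")}  # "x" was already computed elsewhere
keyset = {"y"}                     # keys requested in this call

# Keys whose value is already a Future and that were not requested directly:
values = {k: v for k, v in dsk.items() if isinstance(v, FakeFuture) and k not in keyset}
if values:
    dsk = subs_multiple(dsk, values)

# The reference to "x" inside the task for "y" is now the Future object itself:
assert dsk["y"] == (inc, fut)
```

The substitution keeps the graph small: instead of recursively inlining task bodies, only the Future placeholders are swapped in.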
16 changes: 15 additions & 1 deletion distributed/tests/test_utils_comm.py
@@ -1,7 +1,7 @@
from distributed.core import ConnectionPool
from distributed.comm import Comm
from distributed.utils_test import gen_cluster, loop # noqa: F401
-from distributed.utils_comm import pack_data, gather_from_workers, retry
+from distributed.utils_comm import pack_data, subs_multiple, gather_from_workers, retry

from unittest import mock

@@ -15,6 +15,20 @@ def test_pack_data():
assert pack_data({"a": ["x"], "b": "y"}, data) == {"a": [1], "b": "y"}


def test_subs_multiple():
    data = {"x": 1, "y": 2}
    assert subs_multiple((sum, [0, "x", "y", "z"]), data) == (sum, [0, 1, 2, "z"])
    assert subs_multiple((sum, [0, ["x", "y", "z"]]), data) == (sum, [0, [1, 2, "z"]])

    dsk = {"a": (sum, ["x", "y"])}
    assert subs_multiple(dsk, data) == {"a": (sum, [1, 2])}

    # Tuple key
    data = {"x": 1, ("y", 0): 2}
    dsk = {"a": (sum, ["x", ("y", 0)])}
    assert subs_multiple(dsk, data) == {"a": (sum, [1, 2])}


@gen_cluster(client=True)
def test_gather_from_workers_permissive(c, s, a, b):
rpc = ConnectionPool()
32 changes: 32 additions & 0 deletions distributed/utils_comm.py
@@ -280,6 +280,38 @@ def pack_data(o, d, key_types=object):
return o


def subs_multiple(o, d):
    """ Perform substitutions on a task

    Parameters
    ----------
    o:
        Core data structure containing literals and keys
    d: dict
        Mapping of keys to values
Member:

With the current implementation these have to be str keys, which may be worth noting.

Member Author:

Hrmm, that's a really good point. I think we'll want to cover generic, non-string keys too (thinking of, for example, a persisted dask array, which has keys like {("chunk", 0): <Future>}).

Since d is a mapping containing the task-graph keys to substitute, we could first check whether or not o is itself a key in d. That would let us know to make a substitution when we come across a key like, for example, ("chunk", 0). I pushed a commit with what I mean in code.

Member:

I think at this point all keys are strings already, but I may be wrong. I don't remember when keys are converted to strings (whether on the client or the scheduler side), but at some point everything's a string, so the previous code may be fine. I was mostly commenting to update the docstring.

Member:

The client converts keys to strings before sending to the scheduler. The scheduler only understands string-valued keys.

Member Author:

Thanks for pointing that out, I hadn't realized this string conversion took place. It looks like the conversion happens here:

    dsk2 = str_graph({k: v[0] for k, v in d.items()}, extra_keys)

a few lines after Futures have been inlined in the graph.

Member Author:

Just updated subs_multiple with the improvements from @jcrist in #3303 (comment).

Alternatively, we could try moving the str_graph call before substitutions take place. However, one advantage of the current subs_multiple implementation is that we could use it elsewhere (on generic keys) should the need arise.
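The ordering concern in this thread can be illustrated with a toy example. Here `str()` is only a stand-in for the real key stringification done by str_graph (whose exact output format may differ): once a tuple key has been turned into a string, a substitution mapping keyed on the original tuple no longer matches.

```python
# Toy illustration: substitution must happen before keys are stringified.
tuple_key = ("y", 0)
data = {tuple_key: 42}  # substitution mapping keyed on the original tuple

# Before stringification, the tuple key matches directly:
assert data.get(tuple_key, tuple_key) == 42

# After stringification (str() as a stand-in for the real conversion),
# the graph refers to a string that is not a key in `data`, so the
# lookup falls through to the default and no substitution occurs:
stringified = str(tuple_key)
assert data.get(stringified, stringified) == "('y', 0)"
```

This is why keeping subs_multiple generic over key types, and running it before the string conversion, covers cases like persisted dask arrays.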


    Examples
    --------
    >>> dsk = {"a": (sum, ["x", 2])}
    >>> data = {"x": 1}
    >>> subs_multiple(dsk, data)  # doctest: +SKIP
    {'a': (sum, [1, 2])}

    """
    typ = type(o)
    if typ is tuple and o and callable(o[0]):  # istask(o)
        return (o[0],) + tuple(subs_multiple(i, d) for i in o[1:])
    elif typ is list:
        return [subs_multiple(i, d) for i in o]
    elif typ is dict:
        return {k: subs_multiple(v, d) for (k, v) in o.items()}
    else:
        try:
            return d.get(o, o)
        except TypeError:
            return o
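A usage sketch of the helper above (re-stated without the docstring so the snippet runs standalone), showing the two edge cases discussed in the review: tuple keys fall through to the final `d.get` branch, and unhashable leaves raise TypeError there and are returned unchanged.

```python
# Standalone copy of subs_multiple for demonstration.
def subs_multiple(o, d):
    typ = type(o)
    if typ is tuple and o and callable(o[0]):  # istask(o)
        return (o[0],) + tuple(subs_multiple(i, d) for i in o[1:])
    elif typ is list:
        return [subs_multiple(i, d) for i in o]
    elif typ is dict:
        return {k: subs_multiple(v, d) for (k, v) in o.items()}
    else:
        try:
            return d.get(o, o)
        except TypeError:
            return o

data = {"x": 1, ("y", 0): 2}

# ("y", 0) is neither a task (its head "y" is not callable) nor a list/dict,
# so it reaches the final branch and is looked up directly:
assert subs_multiple((sum, ["x", ("y", 0)]), data) == (sum, [1, 2])

# An unhashable leaf (e.g. a set) makes dict.get raise TypeError; it is
# returned unchanged instead of crashing the traversal:
assert subs_multiple({1, 2}, data) == {1, 2}
```

Note that non-task tuples are treated as atomic keys rather than recursed into, which is exactly what makes the generic-key case work.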


retry_count = dask.config.get("distributed.comm.retry.count")
retry_delay_min = parse_timedelta(
dask.config.get("distributed.comm.retry.delay.min"), default="s"
Expand Down