Properly handle unknown chunk sizes in P2P rechunking #7856
Conversation
Unit Test Results: see the test report for an extended history of previous test failures; this is useful for diagnosing flaky tests. 20 files (+1), 20 suites (+1), 11h 0m 33s ⏱️ (+41m 0s). For more details on these failures and errors, see this check. Results for commit a7ae007, compared against base commit 28459a5. This pull request removes 13 tests and adds 30. Note that renamed tests count towards both.
♻️ This comment has been updated with latest results.
Thanks @hendrikmakait -- it looks like there are some legitimate rechunking errors here
dask.array.rechunk._old_to_new

    old_chunks = x.chunks
    new_chunks = chunks

    def is_unknown(dim: ChunkedAxis) -> bool:
        return any(math.isnan(chunk) for chunk in dim)

    old_is_unknown = [is_unknown(dim) for dim in old_chunks]
    new_is_unknown = [is_unknown(dim) for dim in new_chunks]

    if old_is_unknown != new_is_unknown or any(
        new != old for new, old in compress(zip(old_chunks, new_chunks), old_is_unknown)
    ):
        raise ValueError(
            "Chunks must be unchanging along dimensions with missing values.\n\n"
            "A possible solution:\n  x.compute_chunk_sizes()"
        )

    old_known = [dim for dim, unknown in zip(old_chunks, old_is_unknown) if not unknown]
    new_known = [dim for dim, unknown in zip(new_chunks, new_is_unknown) if not unknown]

    old_sizes = [sum(o) for o in old_known]
    new_sizes = [sum(n) for n in new_known]

    if old_sizes != new_sizes:
        raise ValueError(
            f"Cannot change dimensions from {old_sizes!r} to {new_sizes!r}"
        )
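To make the validation logic above concrete, here is a minimal, self-contained sketch of the same checks. The function name `validate_rechunk` is hypothetical (it is not part of dask's API); the logic mirrors the excerpt: dimensions containing unknown (NaN) chunk sizes must be passed through unchanged, and fully known dimensions must preserve their total size.

```python
# Hedged sketch of the validation in the diff above; `validate_rechunk`
# is a hypothetical name, not a dask function.
import math
from itertools import compress


def validate_rechunk(old_chunks, new_chunks):
    def is_unknown(dim):
        # A dimension is "unknown" if any of its chunk sizes is NaN
        return any(math.isnan(chunk) for chunk in dim)

    old_is_unknown = [is_unknown(dim) for dim in old_chunks]
    new_is_unknown = [is_unknown(dim) for dim in new_chunks]

    # Unknown dimensions must stay exactly as they were
    if old_is_unknown != new_is_unknown or any(
        new != old
        for new, old in compress(zip(old_chunks, new_chunks), old_is_unknown)
    ):
        raise ValueError(
            "Chunks must be unchanging along dimensions with missing values."
        )

    # Known dimensions may be rechunked, but their total size must match
    old_sizes = [sum(d) for d, unk in zip(old_chunks, old_is_unknown) if not unk]
    new_sizes = [sum(d) for d, unk in zip(new_chunks, new_is_unknown) if not unk]
    if old_sizes != new_sizes:
        raise ValueError(
            f"Cannot change dimensions from {old_sizes!r} to {new_sizes!r}"
        )
```

For example, rechunking the known axis of `((nan, nan), (10,))` to `(5, 5)` passes, while touching the unknown axis raises.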
This part is currently copied from dask.array.rechunk.old_to_new. This should be cleaned up in a follow-up PR that moves the validation logic into a shared helper.
    pytest.param(
        da.ones(shape=(1000, 10), chunks=(5, 10)),
        (None, 5),
        marks=pytest.mark.skip(reason="distributed#7757"),
    ),
    pytest.param(
        da.ones(shape=(1000, 10), chunks=(5, 10)),
        {1: 5},
        marks=pytest.mark.skip(reason="distributed#7757"),
    ),
    pytest.param(
        da.ones(shape=(1000, 10), chunks=(5, 10)),
        (None, (5, 5)),
        marks=pytest.mark.skip(reason="distributed#7757"),
    ),
These parametrizations are currently failing on CI due to horrible performance. My suspicion is that the performance problems are related to (but not necessarily exclusively caused by) #7757.
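For context, the three skipped parametrizations all request the same target chunking for a `(1000, 10)` array, just spelled differently: `None` keeps an axis's chunks, an integer means a uniform chunk size, a dict maps axis index to spec, and a tuple lists explicit chunk sizes. A rough sketch of that normalization (the `normalize_spec` helper is hypothetical, not dask's implementation):

```python
# Hypothetical helper illustrating that the three skipped parametrizations
# describe the same target chunking; this is NOT dask's implementation.
def normalize_spec(shape, old_chunks, spec):
    if isinstance(spec, dict):
        # Axes missing from the dict are left unchanged (treated as None)
        spec = tuple(spec.get(i) for i in range(len(shape)))
    normalized = []
    for size, old, dim in zip(shape, old_chunks, spec):
        if dim is None:
            normalized.append(old)  # keep the existing chunks on this axis
        elif isinstance(dim, int):
            # uniform chunk size, with a smaller trailing chunk if needed
            full, rest = divmod(size, dim)
            normalized.append((dim,) * full + ((rest,) if rest else ()))
        else:
            normalized.append(tuple(dim))  # explicit per-chunk sizes
    return tuple(normalized)


shape = (1000, 10)
old = ((5,) * 200, (10,))
assert (
    normalize_spec(shape, old, (None, 5))
    == normalize_spec(shape, old, {1: 5})
    == normalize_spec(shape, old, (None, (5, 5)))
)
```

All three normalize to `((5,) * 200, (5, 5))`, which is why a performance regression in one of them likely affects the others as well.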
A/B tests did not show any performance impact: https://github.com/coiled/benchmarks/actions/runs/5078894598
Sibling to
pre-commit run --all-files