Use array size directly when checking multiscale arrays to prevent overflow #5759
Conversation
Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #5759   +/-  ##
======================================
+ Coverage   89.83%   89.86%   +0.02%
======================================
  Files         609      610       +1
  Lines       52046    52134      +88
======================================
+ Hits        46755    46848      +93
+ Misses       5291     5286       -5
```
Nice! Looks good to me 🚀
Rerunning a failed test as it seemed spurious.
I've been having issues with this CI job. I'm not suggesting merging without that passing, but this is turning into a time sink for me. Since this is a pretty minor change that tests an overflow, and it only breaks when testing against the min_req libraries, it doesn't feel valuable enough to keep investigating. Also note that the "failure" is more of a timeout/termination than a failing assertion. I'm open to being convinced otherwise, but I'm moving on for now.
I will try to reproduce this on a Linux machine this evening.
@jaimergp suggested in the meeting this might be OOM, and the exit codes make me suspicious of that as well. It shouldn't be that much memory really, but you could try this to allocate far less. Maybe one of the older deps is causing this to allocate more? Even on my machine with up-to-date dependencies, this change speeds up testing this file about 7x:

```diff
diff --git a/napari/layers/image/_tests/test_image_utils.py b/napari/layers/image/_tests/test_image_utils.py
index bc0305376..127e028da 100644
--- a/napari/layers/image/_tests/test_image_utils.py
+++ b/napari/layers/image/_tests/test_image_utils.py
@@ -1,5 +1,6 @@
 import time

+import dask
 import dask.array as da
 import numpy as np
 import pytest
@@ -76,7 +77,10 @@ def test_guess_multiscale():
     # Test for overflow in calculating array sizes
     s = 17179869184
-    data = [da.ones((s,) * 2), da.ones((s // 2,) * 2)]
+    data = [
+        da.from_delayed(dask.delayed(lambda: None), shape=(s,) * 2, dtype=np.float64),
+        da.from_delayed(dask.delayed(lambda: None), shape=(s // 2,) * 2, dtype=np.float64),
+    ]
     assert guess_multiscale(data)[0]
```

Edit: apologies if this is too much nitpick/code-golf, but I find this a bit clearer.

```diff
diff --git a/napari/layers/image/_image_utils.py b/napari/layers/image/_image_utils.py
index 5e85a0d84..2a49559c7 100644
--- a/napari/layers/image/_image_utils.py
+++ b/napari/layers/image/_image_utils.py
@@ -65,12 +65,12 @@ def guess_multiscale(data) -> Tuple[bool, LayerDataProtocol]:
         return False, data[0]
     shapes = [d.shape for d in data]
-    sizes = np.array([np.prod(shape, dtype='object') for shape in shapes])
+    sizes = [d.size for d in data]
     if len(sizes) <= 1:
         return False, data
-    consistent = bool(np.all(sizes[:-1] > sizes[1:]))
-    if np.all(sizes == sizes[0]):
+    consistent = all(s1 > s2 for s1, s2 in zip(sizes[:-1], sizes[1:]))
+    if all(s == sizes[0] for s in sizes):
         # note: the individual array case should be caught by the first
         # code line in this function, hasattr(ndim) and ndim > 1.
         raise ValueError(
```
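The `da.from_delayed` trick in the test diff above builds an array whose shape is pure metadata, so the huge buffer is never allocated. A minimal standalone sketch of the idea (assuming dask is installed; the placeholder delayed callable is only illustrative):

```python
import dask
import dask.array as da
import numpy as np

s = 17179869184  # 2**34; far too large to ever materialize

# from_delayed builds an array from shape/dtype metadata only; the delayed
# callable is never executed unless the array is actually computed.
a = da.from_delayed(dask.delayed(lambda: None), shape=(s, s), dtype=np.float64)

assert a.shape == (s, s)
assert a.size == s * s  # exact element count, with no allocation
```

This is why the test gets both faster and lighter: `guess_multiscale` only inspects shapes and sizes, so lazy placeholder arrays exercise the same code path as real ones.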
Fix was shared here: napari#5759 (comment)
@aganders3, my hero! One thing these fixes highlight is that
Does this need any other changes?
After a quick read of the current code, I think the title and description should be updated to reflect the changes in this PR.
Good point, @Czaki! Both have been updated. |
…erflow (#5759)

# Fixes/Closes

Closes #5758

# Description

This PR changes the way array sizes are compared when guessing whether data is multiscale. The problem was that `np.prod(a.shape)` was being used, which can overflow (see the Notes in [numpy.ndarray.size](https://numpy.org/doc/stable/reference/generated/numpy.ndarray.size.html)). That has been changed to `a.size` instead (ty @aganders3).

Additionally, during testing there was what appeared to be an out-of-memory error in the Python 3.8 Windows min req job (ty @jaimergp). As a result, the test now also uses a Dask delayed array to avoid the large memory allocation (ty @aganders3).

## Type of change

- [x] Bug-fix (non-breaking change which fixes an issue)

# How has this been tested?

- [x] example: the test suite for my feature covers the bug found in #5758
- [x] example: all tests pass with my change

## Final checklist:

- [x] My PR is the minimum possible work for the desired functionality
- [x] I have commented my code, particularly in hard-to-understand areas
- [x] I have added tests that prove my fix is effective or that my feature works

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
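The overflow described above is easy to reproduce with plain NumPy; a minimal sketch (note that `np.prod(shape, dtype=object)`, like the original workaround in the code, forces exact Python-int arithmetic):

```python
import numpy as np

s = 17179869184  # 2**34
shape = (s, s)   # true element count is 2**68, which does not fit in int64

# np.prod accumulates in a fixed-width platform integer type and silently
# wraps around on overflow.
overflowed = int(np.prod(shape))

# Forcing Python-object arithmetic gives the exact count, as does reading
# the array's own `size` attribute.
exact = int(np.prod(shape, dtype=object))

assert overflowed != 2**68  # wrapped around
assert exact == 2**68
```

Reading `a.size` directly, as this PR does, sidesteps the accumulator-width question entirely and is also cheaper than building an object-dtype intermediate.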