Drop vendored compatibility code (#54)
* Bump Dask requirement to 0.16.1+

The Dask 0.13.0 requirement was originally chosen because it was a
reasonably stable, somewhat older Dask version at the time, and Python
3.4 packages existed for it on conda-forge. It was also believed this
would make it easier for other projects to use this Dask-based image
processing code.

However, many changes were pushed upstream to Dask while this library
was being developed, and were largely vendored or worked around here
through compatibility code. In some cases the upstream implementations
have since seen notable performance improvements. In other cases,
functionality (like Dask Arrays of unknown shape) has simply become
easier to work with, requiring fewer hacks. There is also the
maintenance burden this extra code entails, which is not worth the
effort given upstream often has superior implementations. Generally,
keeping this compatibility code has become a hindrance.

Also, Python 3.4 is no longer supported by Dask or by many of the
projects downstream of it. Further, some of these projects already
require substantially newer versions of Dask to function. As we may have
similar but divergent implementations of things, as noted above, this
can be costly in terms of interoperability. So supporting such old
versions of Dask does not really help others wanting to use this
library.

Given all of this, it makes sense to bump the minimum Dask version for
better performance, a simpler code base, and nicer interoperability with
other libraries. The minimum version of Dask that lets us eliminate all
compatibility code is 0.16.1. While a newer version could be required
for other new features, none of these are needed yet. As 0.16.1 came out
early this year, it has been around for a while. It is also the last
patch release of the 0.16 series. Plus, 0.16.0 introduced the Dask
Collections interface to the public API, which makes it easier to
construct custom graphs should they be needed. Thus go ahead and bump
the minimum Dask version to 0.16.1.

* Drop all `_compat` modules

As the `_compat` modules were added to provide Dask Array functionality
not yet included in the Dask version we were using, they are now
unneeded given our minimum version of Dask has been bumped up to a
version that includes all of this functionality. So just remove the
`_compat` modules from the codebase.

* Drop tests of compatibility code

As the compatibility modules have been removed, there is no need to hang
on to tests of their functionality. Instead simply drop them as well.

* Replace compatibility functions with Dask ones

As all of the compatibility functions are now in Dask in some form,
replace them all with their Dask Array equivalent.
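Each dropped helper has a direct public replacement. As a rough sketch
(NumPy shown, since dask.array deliberately mirrors these NumPy names;
the dask versions add chunking on top of the same semantics):

```python
import numpy as np

# Rough mapping of the removed helpers to public API (an illustration,
# not the commit's exact diff):
#   _compat._sinc(x)     -> dask.array.sinc(x)           (np.sinc)
#   _compat._asarray(x)  -> dask.array.asarray(x)        (np.asarray)
#   _compat._fftfreq(n)  -> dask.array.fft.fftfreq(n)    (np.fft.fftfreq)

print(np.sinc(0.0))        # 1.0 (normalized sinc: sin(pi x) / (pi x))
print(np.fft.fftfreq(4))   # [ 0.    0.25 -0.5  -0.25]
```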

* Use `meshgrid` in `_get_freq_grid`

Instead of rolling our own version of `meshgrid` in `_get_freq_grid` to
construct the frequency grid, simply pass the results from `fftfreq`
on to `meshgrid`. This drastically cuts down on the amount of code and
benefits from all of the optimizations upstream has implemented in
`meshgrid`, which our `repeat`-based implementation lacked.
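The reworked construction can be sketched as follows (a minimal NumPy
version; the library's code uses `dask.array.fft.fftfreq`,
`dask.array.meshgrid`, and `dask.array.stack`, which mirror these NumPy
calls while adding chunking):

```python
import numpy as np

def get_freq_grid(shape, dtype=float):
    """Sketch of the meshgrid-based frequency grid construction."""
    # One 1-D frequency axis per dimension...
    axes = [np.fft.fftfreq(s).astype(dtype) for s in shape]
    # ...broadcast into an n-D grid with matrix ("ij") indexing...
    grid = np.meshgrid(*axes, indexing="ij")
    # ...and stacked so axis 0 indexes the dimension of each frequency.
    return np.stack(grid)

g = get_freq_grid((4, 8))
print(g.shape)  # (2, 4, 8): one 4x8 frequency plane per dimension
```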

* Use Dask Array's `where` directly

We had wrapped Dask Array's `where` to perform a quick optimization if
the condition was a simple scalar (e.g. `True` or `False`). In this
special case, we merely selected the correct result and returned it
instead of performing a lengthy computation, whose result was already
known during graph computation. This optimization has long since made
its way into Dask Array. So there is no value in us holding onto this
wrapper code any longer. Hence it is dropped.

* Drop tests for the `ndmorph._ops` module

The only test of `ndmorph._ops` was for the `_where` function, which we
no longer need. The other functionality in there is a wrapper that is
already effectively tested by the construction of all `ndmorph`
functions and their tests. An explicit test of the wrapper adds little,
since its behavior is already exercised through the functions built with
it. Given this, drop this test module as it is no longer needed.

* Remove map_overlap 0-depth workaround

Upstream has since incorporated a change to ensure setting the depth to
`0` in `map_overlap` does not cause an error (for cases other than
`reflect`). As we are bumping the Dask version, we no longer need this
workaround. Hence this drops it to clean up some code and simplify
maintenance. This also updates the corresponding test to stop checking
that this workaround is used.
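A minimal sketch of what now works directly, assuming a current Dask is
installed (the function and values here are illustrative, not from the
commit):

```python
import numpy as np
import dask.array as da

# Previously, boundary "none" with depth 0 had to be rewritten to
# "reflect" as a workaround (see dask/dask#2258); with the bumped
# minimum Dask version, a zero overlap depth just works.
x = da.arange(10, chunks=5)
y = x.map_overlap(lambda block: block + 1, depth=0, boundary="none")
print(y.compute())  # [ 1  2  3  4  5  6  7  8  9 10]
```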
jakirkham committed Sep 3, 2018
1 parent cbbcea8 commit aab8434
Showing 23 changed files with 24 additions and 286 deletions.
2 changes: 1 addition & 1 deletion .appveyor_support/environments/tst_py27.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .appveyor_support/environments/tst_py35.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .appveyor_support/environments/tst_py36.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .circleci/environments/tst_py27.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .circleci/environments/tst_py35.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .circleci/environments/tst_py36.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .travis_support/environments/tst_py27.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .travis_support/environments/tst_py35.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
2 changes: 1 addition & 1 deletion .travis_support/environments/tst_py36.yml
@@ -9,7 +9,7 @@ dependencies:
 - wheel==0.31.1
 - coverage==4.5.1
 - pytest==3.0.5
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - scipy==0.19.1
 - scikit-image=0.12.3
8 changes: 0 additions & 8 deletions dask_image/ndfilters/_utils.py
@@ -90,14 +90,6 @@ def _get_depth_boundary(ndim, depth, boundary=None):
     if not all(map(type_check, boundary.values())):
         raise TypeError("Expected string-like values for `boundary`.")
 
-    # Workaround for a bug in Dask with 0 depth.
-    #
-    # ref: https://github.com/dask/dask/issues/2258
-    #
-    for i in irange(ndim):
-        if boundary[i] == "none" and depth[i] == 0:
-            boundary[i] = "reflect"
-
     return depth, boundary
3 changes: 1 addition & 2 deletions dask_image/ndfourier/__init__.py
@@ -8,7 +8,6 @@
 
 import dask.array
 
-from . import _compat
 from . import _utils
@@ -200,7 +199,7 @@ def fourier_uniform(input, size, n=-1, axis=-1):
     )
 
     # Compute uniform filter
-    uniform = _compat._sinc(
+    uniform = dask.array.sinc(
         size[(slice(None),) + input.ndim * (None,)] * freq_grid
     )
     uniform = dask.array.prod(uniform, axis=0)
66 changes: 0 additions & 66 deletions dask_image/ndfourier/_compat.py

This file was deleted.

25 changes: 6 additions & 19 deletions dask_image/ndfourier/_utils.py
@@ -8,8 +8,7 @@
 
 import dask.array
 
-from . import _compat
-from .._pycompat import imap, irange
+from .._pycompat import imap, irange, izip
@@ -21,23 +20,11 @@ def _get_freq_grid(shape, chunks, dtype=float):
     assert (issubclass(dtype, numbers.Real) and
             not issubclass(dtype, numbers.Integral))
 
-    ndim = len(shape)
-
-    freq_grid = []
-    for i in irange(ndim):
-        sl = ndim * [None]
-        sl[i] = slice(None)
-        sl = tuple(sl)
-
-        freq_grid_i = _compat._fftfreq(shape[i], chunks=chunks[i])
-        freq_grid_i = freq_grid_i.astype(dtype)
-        freq_grid_i = freq_grid_i[sl]
-
-        for j in itertools.chain(range(i), range(i + 1, ndim)):
-            freq_grid_i = freq_grid_i.repeat(shape[j], axis=j)
-
-        freq_grid.append(freq_grid_i)
-
+    freq_grid = [
+        dask.array.fft.fftfreq(s, chunks=c).astype(dtype)
+        for s, c in izip(shape, chunks)
+    ]
+    freq_grid = dask.array.meshgrid(*freq_grid, indexing="ij")
+    freq_grid = dask.array.stack(freq_grid)
 
     return freq_grid
3 changes: 1 addition & 2 deletions dask_image/ndmeasure/__init__.py
@@ -15,7 +15,6 @@
 
 import dask.array
 
-from . import _compat
 from .. import _pycompat
 from . import _utils
@@ -203,7 +202,7 @@ def label(input, structure=None):
         How many objects were found.
     """
 
-    input = _compat._asarray(input)
+    input = dask.array.asarray(input)
 
     if not all([len(c) == 1 for c in input.chunks]):
         warn("``input`` does not have 1 chunk in all dimensions; it will be consolidated first", RuntimeWarning)
31 changes: 0 additions & 31 deletions dask_image/ndmeasure/_compat.py

This file was deleted.

7 changes: 3 additions & 4 deletions dask_image/ndmeasure/_utils.py
@@ -9,7 +9,6 @@
 import dask
 import dask.array
 
-from . import _compat
 from .. import _pycompat
@@ -18,7 +17,7 @@ def _norm_input_labels_index(input, labels=None, index=None):
     """
     Normalize arguments to a standard form.
     """
 
-    input = _compat._asarray(input)
+    input = dask.array.asarray(input)
 
     if labels is None:
         labels = dask.array.ones(input.shape, dtype=int, chunks=input.chunks)
@@ -27,8 +26,8 @@
         labels = (labels > 0).astype(int)
         index = dask.array.ones(tuple(), dtype=int, chunks=tuple())
 
-    labels = _compat._asarray(labels)
-    index = _compat._asarray(index)
+    labels = dask.array.asarray(labels)
+    index = dask.array.asarray(index)
 
     if index.ndim > 1:
         warnings.warn(
13 changes: 1 addition & 12 deletions dask_image/ndmorph/_ops.py
@@ -9,17 +9,6 @@
 from .._pycompat import irange
 
 
-def _where(condition, x, y):
-    if isinstance(condition, (bool, numpy.bool8)):
-        dtype = numpy.promote_types(x.dtype, y.dtype)
-        if condition:
-            return x.astype(dtype)
-        else:
-            return y.astype(dtype)
-    else:
-        return dask.array.where(condition, x, y)
-
-
 def _binary_op(func,
                input,
                structure=None,
@@ -49,6 +38,6 @@ def _binary_op(func,
         origin=origin,
         **kwargs
     )
-    result = _where(mask, iter_result, result)
+    result = dask.array.where(mask, iter_result, result)
 
     return result
2 changes: 1 addition & 1 deletion environment_doc.yml
@@ -8,7 +8,7 @@ dependencies:
 - wheel==0.31.1
 - sphinx==1.7.5
 - sphinx_rtd_theme==0.4.1
-- dask==0.13.0
+- dask==0.16.1
 - numpy==1.11.3
 - pims==0.4.1
 - slicerator==0.9.8
2 changes: 1 addition & 1 deletion setup.py
@@ -29,7 +29,7 @@ def run_tests(self):
     history = history_file.read()
 
 requirements = [
-    "dask[array] >=0.13.0",
+    "dask[array] >=0.16.1",
     "numpy >=1.11.3",
     "scipy >=0.19.1",
     "pims >=0.4.1",
2 changes: 1 addition & 1 deletion tests/test_dask_image/test_ndfilters/test__utils.py
@@ -136,7 +136,7 @@ def test_errs__get_footprint(err_type, ndim, size, footprint):
 @pytest.mark.parametrize(
     "expected, ndim, depth, boundary",
     [
-        (({0: 0}, {0: "reflect"}), 1, 0, "none"),
+        (({0: 0}, {0: "none"}), 1, 0, "none"),
         (({0: 0}, {0: "reflect"}), 1, 0, "reflect"),
         (({0: 0}, {0: "periodic"}), 1, 0, "periodic"),
         (({0: 1}, {0: "none"}), 1, 1, "none"),
29 changes: 0 additions & 29 deletions tests/test_dask_image/test_ndfourier/test__compat.py

This file was deleted.
