
MAINT: Unify calculation of normal vectors from polygons #12136

Merged
2 commits merged on Dec 6, 2018

Conversation

@eric-wieser (Contributor) commented Sep 16, 2018

This combines `get_normals` and `_generate_normals`, and eliminates all other calls to `np.cross`. This came up in #9990, but I wanted to postpone it to this later PR.

`get_normals` and `_generate_normals` were profiled, and it was found that vectorizing `np.cross` as in `get_normals` was faster:

import numpy as np

def get_normals(polygons):
    v1 = np.empty((len(polygons), 3))
    v2 = np.empty((len(polygons), 3))
    for poly_i, ps in enumerate(polygons):
        # Pick three points spread around the polygon from which to
        # compute the normal. Doesn't vectorize because polygons is jagged.
        i1, i2, i3 = 0, len(ps)//3, 2*len(ps)//3
        v1[poly_i, :] = ps[i1, :] - ps[i2, :]
        v2[poly_i, :] = ps[i2, :] - ps[i3, :]
    return np.cross(v1, v2)

def _generate_normals(polygons):
    normals = []
    for verts in polygons:
        v1 = np.array(verts[0]) - np.array(verts[1])
        v2 = np.array(verts[2]) - np.array(verts[0])
        normals.append(np.cross(v1, v2))
    return np.array(normals)

def _generate_normals_new(polygons):
    if isinstance(polygons, np.ndarray):
        # optimization: polygons all have the same number of points, so can
        # vectorize
        n = polygons.shape[-2]
        i1, i2, i3 = 0, n//3, 2*n//3
        v1 = polygons[..., i1, :] - polygons[..., i2, :]
        v2 = polygons[..., i2, :] - polygons[..., i3, :]
    else:
        # The subtraction doesn't vectorize because polygons is jagged.
        v1 = np.empty((len(polygons), 3))
        v2 = np.empty((len(polygons), 3))
        for poly_i, ps in enumerate(polygons):
            n = len(ps)
            i1, i2, i3 = 0, n//3, 2*n//3
            v1[poly_i, :] = ps[i1, :] - ps[i2, :]
            v2[poly_i, :] = ps[i2, :] - ps[i3, :]
    return np.cross(v1, v2)

polygons_jagged = [
    np.random.rand(np.random.randint(10, 1000), 3)
    for i in range(100)
]

polygons_uniform = np.random.rand(100, 100, 3)
In [10]: %timeit _generate_normals(polygons_jagged)
8.43 ms ± 148 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [11]: %timeit get_normals(polygons_jagged)
583 µs ± 12.9 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)

In [12]: %timeit _generate_normals_new(polygons_jagged)
593 µs ± 26 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)


In [14]: %timeit _generate_normals(polygons_uniform)
8.45 ms ± 156 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

In [15]: %timeit get_normals(polygons_uniform)
599 µs ± 6.13 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)

In [13]: %timeit _generate_normals_new(polygons_uniform)
110 µs ± 10.1 µs per loop (mean ± std. dev. of 7 runs, 10000 loops each)

PR Checklist

Do not apply, since this is just a refactor:

  • Has Pytest style unit tests
  • New features are documented, with examples if plot related
  • Documentation is sphinx and numpydoc compliant
  • Added an entry to doc/users/next_whats_new/ if major new feature (follow instructions in README.rst there)
  • Documented in doc/api/api_changes.rst if API changed in a backward-incompatible way

@eric-wieser (Contributor, Author) commented Sep 16, 2018

The branch name hints that my goal here is to add shading to ax3d.voxels in a later PR.

@eric-wieser (Contributor, Author)

An alternate implementation with less duplication, but also less clarity, could look like:

        def _inplane_vectors(polygons, v1_out=None, v2_out=None):
            n = polygons.shape[-2]
            i1, i2, i3 = 0, n//3, 2*n//3
            v1 = np.subtract(polygons[..., i1, :], polygons[..., i2, :], out=v1_out)
            v2 = np.subtract(polygons[..., i2, :], polygons[..., i3, :], out=v2_out)
            return v1, v2

        if isinstance(polygons, np.ndarray):
            # optimization: polygons all have the same number of points, so can
            # vectorize
            v1, v2 = _inplane_vectors(polygons)
        else:
            # The subtraction doesn't vectorize because polygons is jagged.
            v1 = np.empty((len(polygons), 3))
            v2 = np.empty((len(polygons), 3))
            for poly_i, ps in enumerate(polygons):
                _inplane_vectors(ps, v1_out=v1[poly_i, :], v2_out=v2[poly_i, :])
        return np.cross(v1, v2)

Let me know which is preferable.

@eric-wieser (Contributor, Author) commented Sep 17, 2018

This fails because it changes the sense of the polygons. It seems that some functions use a right-hand rule for the "front" face of a polygon, while others use a left-hand rule. Related: #12138
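The winding issue comes down to the anticommutativity of the cross product; a minimal illustration (not from the PR):

```python
import numpy as np

# np.cross is anticommutative, so traversing a polygon's vertices in
# the opposite order flips the direction of the computed normal.
a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])

right_handed = np.cross(a, b)  # normal for counter-clockwise winding
left_handed = np.cross(b, a)   # normal for clockwise winding

assert np.array_equal(right_handed, np.array([0.0, 0.0, 1.0]))
assert np.array_equal(right_handed, -left_handed)
```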

@eric-wieser (Contributor, Author)

Just to give an update: this is now waiting on #12259.

This combines `get_normals` and `_generate_normals`, and eliminates all other calls to np.cross.

`get_normals` and `_generate_normals` were profiled, and it was found that vectorizing `np.cross`  like in `get_normals` was faster:

```python
import numpy as np

def get_normals(polygons):
    v1 = np.empty((len(polygons), 3))
    v2 = np.empty((len(polygons), 3))
    for poly_i, ps in enumerate(polygons):
        # Pick three points spread around the polygon from which to
        # compute the normal. Doesn't vectorize because polygons is jagged.
        i1, i2, i3 = 0, len(ps)//3, 2*len(ps)//3
        v1[poly_i, :] = ps[i1, :] - ps[i2, :]
        v2[poly_i, :] = ps[i2, :] - ps[i3, :]
    return np.cross(v1, v2)

def _generate_normals(polygons):
    normals = []
    for verts in polygons:
        v1 = np.array(verts[0]) - np.array(verts[1])
        v2 = np.array(verts[2]) - np.array(verts[0])
        normals.append(np.cross(v1, v2))
    return np.array(normals)

polygons = [
    np.random.rand(np.random.randint(10, 1000), 3)
    for i in range(100)
]

%timeit _generate_normals(polygons)
# 3.14 ms ± 255 µs per loop (mean ± std. dev. of 7 runs, 100 loops each)

%timeit get_normals(polygons)
# 452 µs ± 4.33 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each)
```
@eric-wieser (Contributor, Author)

Rebased and ready to go

@timhoffm (Member) left a comment

Looks good otherwise. Please rebase.


Parameters
----------
polygons: list of (M_i, 3) array_like, or (..., M, 3) array_like
@timhoffm (Member) commented Nov 27, 2018

Why the `i` index?

@eric-wieser (Contributor, Author) commented Nov 27, 2018

Because the value of M can change between list items: `polygons[0]` has shape `(M_0, 3)`, `polygons[1]` has shape `(M_1, 3)`, and it is likely that `M_1 != M_0`.
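A tiny sketch (with made-up vertex counts) of what the `(M_i, 3)` notation describes:

```python
import numpy as np

# Hypothetical jagged input: each polygon has its own vertex count M_i,
# so the list cannot be stacked into a single (N, M, 3) ndarray.
rng = np.random.default_rng(0)
polygons = [rng.random((m, 3)) for m in (4, 7, 12)]

assert [p.shape for p in polygons] == [(4, 3), (7, 3), (12, 3)]
```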

@eric-wieser (Contributor, Author)

@timhoffm: merged using the web UI, since it was doc-only. Feel free to squash to elide the merge commit

n = polygons.shape[-2]
i1, i2, i3 = 0, n//3, 2*n//3
v1 = polygons[..., i1, :] - polygons[..., i2, :]
v2 = polygons[..., i2, :] - polygons[..., i3, :]
(Member)

Is the sign change here intended? This is the convention of `get_normals`; `_generate_normals` had it the other way round. I know that there have been some orientation issues, but I don't know the current state. Just want to make sure the sign change is not slipping in unintentionally.

@eric-wieser (Contributor, Author)

The sign change is only in `v1` and `v2`; it cancels out in the cross product, so it doesn't affect the return value.

I picked the convention from get_normals because that was easiest. If you want me to flip both subtractions here, I can, but it won't make any difference.
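The cancellation can be checked directly: negating both in-plane vectors leaves the cross product unchanged.

```python
import numpy as np

# Sketch of the cancellation argument: flipping the sign of *both*
# subtractions negates v1 and v2, and the two sign flips cancel in
# the cross product, i.e. (-v1) x (-v2) == v1 x v2.
rng = np.random.default_rng(42)
v1 = rng.random(3)
v2 = rng.random(3)

assert np.allclose(np.cross(v1, v2), np.cross(-v1, -v2))
```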

@timhoffm timhoffm merged commit 3e07fd4 into matplotlib:master Dec 6, 2018
@QuLogic QuLogic added this to the v3.1 milestone Dec 6, 2018
Stephen-Chilcote pushed a commit to Stephen-Chilcote/matplotlib that referenced this pull request Dec 11, 2018
raamana added a commit to raamana/matplotlib that referenced this pull request Dec 13, 2018