
test_light_source_shading_default and test_light_source_masked_shading fails with numpy 1.9 #3598

Closed
jenshnielsen opened this issue Sep 30, 2014 · 15 comments · Fixed by #3755

@jenshnielsen (Member)

These tests are part of #3291 and fail for me with numpy 1.9.0

The tests work with numpy 1.8.2 (on Travis and locally). I'm not sure what is wrong here.
Apologies for not spotting this before the merge.

@tacaswell (Member)

To quote @mdboom quoting one of his bosses, "if you are not breaking the build you aren't writing enough code".

It is unreasonable to expect you to run a more thorough build matrix than we do on Travis. This probably means we should test at least one version of python (I advocate for 2.7 and 3.4) with np1.9 on Travis.

@tacaswell tacaswell added this to the v1.4.x milestone Sep 30, 2014
@jenshnielsen (Member Author)

Yes, we probably should. I don't fully understand why pip doesn't install 1.9 on Travis, but the output confirms it. Perhaps it is related to the use of --use-mirrors on Travis?

@jenshnielsen (Member Author)

This is a difference on the edges, likely caused by the changes to np.gradient: http://mail.scipy.org/pipermail/numpy-discussion/2014-October/071334.html
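For context on what changed (my reading of the numpy 1.9 release notes, so treat the exact formulas as an assumption): the boundaries of `np.gradient` moved from first-order one-sided differences to second-order accurate one-sided differences. A tiny sketch on f(x) = x², whose true derivative at x = 0 is 0:

```python
import numpy as np

# f(x) = x**2 sampled at x = 0..4
z = np.array([0., 1., 4., 9., 16.])

# numpy <= 1.8 boundary: first-order one-sided difference
first_order = z[1] - z[0]                           # 1.0 (off by 1)

# numpy 1.9 boundary: second-order one-sided difference
second_order = (-3 * z[0] + 4 * z[1] - z[2]) / 2.0  # 0.0 (exact here)

print(first_order, second_order)
```

The interior (central-difference) values are unchanged, which is why only the edge pixels of the image differ.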

@jenshnielsen (Member Author)

Since there is no flag in numpy to restore the old behavior, I think the right solution is to exclude the edges in the comparison. I will make a PR.
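The idea would look something like this (a minimal sketch with made-up arrays, not the actual test data — the real tests compare rendered RGB images, but the slicing is the same):

```python
import numpy as np
from numpy.testing import assert_array_almost_equal

actual = np.arange(25.).reshape(5, 5)
expected = actual.copy()
expected[0, :] += 0.5  # pretend the edge row disagrees, as with np.gradient 1.9

# compare only the interior, ignoring the one-pixel border
assert_array_almost_equal(actual[1:-1, 1:-1], expected[1:-1, 1:-1])
```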

@jenshnielsen (Member Author)

On closer inspection it looks like there might be a real bug here. The fix works for the non-masked test, but the masked one is clearly different.

The output of running the test code with numpy 1.9 is:

[image: numpy19 — shading output under numpy 1.9]

Which is quite different from the expected:

[image: expect — expected shading output]

@tacaswell (Member)

Can you make those with `interpolation='none'`? I bet (hope) the interpolation is making it look worse than it is.

@jenshnielsen (Member Author)

Yes, you are right. I actually did that locally yesterday, but I was on an unstable internet connection so I did not upload them.

@jenshnielsen (Member Author)

[image: numpy19shadebug — difference plot rendered with `interpolation='none'`]

The main differences are on the edges, but the rest is different enough to make the test fail too.

@jenshnielsen (Member Author)

I can confirm that replacing numpy's gradient with the version from 1.8.2 fixes the issue (see https://github.com/jenshnielsen/matplotlib/tree/numpy_gradient if anyone cares), but that is not a solution I think we should go for.

@tacaswell (Member)

No, we should not start shipping old versions of numpy functions.

In the short term, I am inclined to known-fail one or both of those tests with numpy 1.9 while we get this sorted out, so that the tests on master are useful again.
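A sketch of how the gate could look. The version-check helper is my own; the decorator names in the comment are assumptions based on what `matplotlib.testing.decorators` provided around this time, so the exact spelling should be checked against the tree:

```python
import numpy as np

def at_least(version, target=(1, 9)):
    """True if a 'major.minor[.patch]' version string is >= target."""
    parts = tuple(int(p) for p in version.split('.')[:2])
    return parts >= target

NUMPY_1_9 = at_least(np.__version__)

# Hypothetical usage in the test module:
#
# @knownfailureif(NUMPY_1_9, 'np.gradient edge behavior changed in numpy 1.9')
# @image_comparison(baseline_images=['light_source_shading_default'])
# def test_light_source_shading_default():
#     ...
```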

@tacaswell tacaswell modified the milestones: v1.5.x, v1.4.x Oct 13, 2014
@jenshnielsen (Member Author)

Yes, I think we should do that to get the tests passing again on Travis.

It was never my intention to suggest that we ship that function. The intention was just to verify that the difference is limited to the gradient function.

@jenshnielsen (Member Author)

The fundamental reason this looks so bad is that many more elements are masked when the gradient is calculated from a masked array with the new algorithm.
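To illustrate the mechanism with a tiny 1-D example of my own (not the test's data): masked-array arithmetic masks an output wherever either operand is masked, so every output whose difference stencil touches a masked input becomes masked too, and the masked region grows.

```python
import numpy as np

z = np.ma.masked_array(np.arange(6, dtype=float),
                       mask=[0, 0, 1, 0, 0, 0])  # one masked input element

# interior central difference: (z[i+1] - z[i-1]) / 2
central = (z[2:] - z[:-2]) / 2.0

# the single masked input masks BOTH outputs whose stencil touches it
print(int(z.mask.sum()), int(central.mask.sum()))  # 1 2
```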

@tacaswell (Member)

See http://thread.gmane.org/gmane.comp.python.numeric.general/58971/focus=58975.

Noted in the np thread; linking here for convenience.

@joferkington (Contributor)

Just as a vote in favor of shipping a simplified gradient function (I realize this is a terrible idea in 99.99% of cases!!):

All of the functionality of numpy.gradient that we need amounts to:

```python
import numpy as np

def gradient(z, dx=1, dy=1):
    """Simplified 2D gradient: central differences inside, one-sided at the edges."""
    dx_out, dy_out = np.empty_like(z), np.empty_like(z)

    # central differences in the interior
    dy_out[1:-1, :] = (z[2:, :] - z[:-2, :]) / 2.0
    dx_out[:, 1:-1] = (z[:, 2:] - z[:, :-2]) / 2.0

    # first-order one-sided differences on the edges
    dy_out[0, :] = z[1, :] - z[0, :]
    dy_out[-1, :] = z[-1, :] - z[-2, :]
    dx_out[:, 0] = z[:, 1] - z[:, 0]
    dx_out[:, -1] = z[:, -1] - z[:, -2]

    return dy_out / dy, dx_out / dx
```

Of course, it's probably still a terrible idea, but perhaps not quite as bad of an idea as it might seem at first glance...

@jenshnielsen (Member Author)

Let's just hold out for the 1.9.1 release of numpy before we decide on this.

There are actually two separate issues in numpy. The changes to gradient on the edges are fixed by numpy/numpy#5203. The differences on the edges can easily be fixed in the test by not comparing the edge sites.

The changes to gradient revealed another issue in numpy where the masks of masked arrays were linked (fixed by numpy/numpy#5184). This resulted in strange modifications of the input arrays. This might also happen if we rolled our own gradient implementation.
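The linked-mask bug is on numpy's side, but a home-grown gradient could defend against it by decoupling its working copy from the caller's array up front. A minimal sketch using `np.ma.getmaskarray` (the variable names are mine):

```python
import numpy as np

z = np.ma.masked_array([1., 2., 3., 4.], mask=[0, 1, 0, 0])
mask_before = z.mask.copy()

# copy both data and mask so nothing we do can leak back into the input
work = np.ma.masked_array(z.data.copy(),
                          mask=np.ma.getmaskarray(z).copy())
work.mask[0] = True  # mutate only the working copy's mask

print(bool((z.mask == mask_before).all()))  # True: input untouched
```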
