
Properly trigger pytest matplotlib image comparison #352

Merged: 5 commits merged from test-mpl-properly into GenericMappingTools:master on Oct 29, 2019

Conversation

weiji14 (Member) commented Oct 27, 2019

Description of proposed changes

All this while, we weren't actually running the image comparisons! I'd had a hunch for a long while, but it was the change to the SRTM15+V2 grids since #350 that made me suspect things weren't right. There's also a larger degree (°) symbol now on the x and y axes of some of the plots using basemap.

[Images: baseline-test_grdimage (Old), test_grdimage (New), test_grdimage-failed-diff (Diff)]

This Pull Request adds the --mpl flag and saves any differing/failed results via --mpl-results-path=results, which stores the diff images under $TESTDIR/results. See the upstream pytest-mpl docs.
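For context, pytest-mpl activates image comparison per test via a marker; without the `--mpl` flag the decorated tests still run but the comparison step is skipped, which is how the comparisons were silently not happening. A minimal sketch of the pattern (the test name, baseline filename, and tolerance below are illustrative, not taken from pygmt):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend, suitable for CI
import matplotlib.pyplot as plt
import pytest

# Illustrative example of a pytest-mpl test; names are made up.
@pytest.mark.mpl_image_compare(filename="sine.png", tolerance=2)
def test_sine_plot():
    fig, ax = plt.subplots()
    ax.plot([0, 1, 2, 3], [0, 1, 0, -1])
    # pytest-mpl compares the returned figure against the stored
    # baseline image, but only when pytest is invoked with --mpl.
    return fig
```

Invoked as `pytest --mpl --mpl-results-path=results`, a mismatch writes the old/new/diff images under `results/`, matching the triptych shown above.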

I'll do this Test-Driven Development (TDD) style: make the tests fail first (so we have a record), then fix the images to make the tests pass. The failures come mostly from grdimage and grdcontour, but also from makecpt and a couple of other random ones...
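Under the hood, pytest-mpl delegates to matplotlib's image comparison, which scores a result against its baseline by a root-mean-square pixel difference and fails the test when that score exceeds the tolerance. A rough sketch of that scoring (my own illustration, not pytest-mpl's actual code):

```python
import numpy as np

def rms_diff(baseline, result):
    """Root-mean-square pixel difference between two image arrays.

    Simplified: the real comparison lives in matplotlib.testing.compare.
    A test fails when this score exceeds the configured tolerance.
    """
    baseline = np.asarray(baseline, dtype=float)
    result = np.asarray(result, dtype=float)
    return float(np.sqrt(np.mean((baseline - result) ** 2)))

# Identical images score 0; a uniform off-by-one difference scores 1.
```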

TODO:

  • Trigger test failures (30 actual failures in total, ~35 images to change below).
  • Fix grdcontour (5), grdimage (3), logo (2).
  • Fix coast (4), image (1), makecpt (8).
  • Fix basemap (4) and plot (8).

Fixes #

Reminders

  • Run make format and make check to make sure the code follows the style guide.
  • Add tests for new features or tests that would have caught the bug that you're fixing.
  • Add new public functions/methods/classes to doc/api/index.rst.
  • Write detailed docstrings for all functions/methods.
  • If adding new functionality, add an example to docstrings or tutorials.

All the while, we weren't actually testing the image comparisons! Adding the `--mpl` flag and saving the failed results under $TESTDIR/results as per https://github.com/matplotlib/pytest-mpl/blob/954073d458ab4d96796c67d6aeb994d17ee39817/README.rst#using.
@weiji14 weiji14 added the bug Something isn't working label Oct 27, 2019
@weiji14 weiji14 added this to the 0.1.0 milestone Oct 27, 2019
@weiji14 weiji14 self-assigned this Oct 27, 2019
Update baseline images for grdcontour (5), grdimage (3) and logo (2).
Update baseline images for coast (4), image (1) and makecpt (9).
Corrected one of the basemap tests that plotted 'Depth' instead of 'Crustal Age' on the power x-axis label. Also updated the baseline images accordingly for basemap (4) and plot (8).
@weiji14 weiji14 marked this pull request as ready for review October 27, 2019 21:06
weiji14 (Member, Author) commented:

Ready for review! The changes are mainly due to the new SRTM15+V2 grids, plus some plots that now have a larger lat/lon degree sign on the axes.

Review comment on pygmt/tests/test_basemap.py (outdated, resolved)
@weiji14 weiji14 requested a review from a team October 27, 2019 21:11
seisman (Member) left a review:

Looks good to me.

@weiji14 weiji14 merged commit 882dbba into GenericMappingTools:master Oct 29, 2019
@weiji14 weiji14 deleted the test-mpl-properly branch October 29, 2019 02:25
weiji14 referenced this pull request in weiji14/pygmt Nov 7, 2019
Refresh baseline plots to use SRTM15+V2 grids as per #350, and also fix some filenames after being caught by pytest's matplotlib image comparison (thanks to #352!).