test_backend_pgf: TypeError #1843

Merged

merged 2 commits into matplotlib:master

5 participants

@mgiuca-google

The tol parameter was missing in the call to compare_images().

@mgiuca-google mgiuca-google test_backend_pgf: TypeError.
The tol parameter was missing in the call to compare_images().
a92e7b3
@mgiuca-google

This is a fix for Issue #1842.

The problem was that I deleted the tol parameter from all the calls to decorators.image_comparison (since it defaults to 10 and we wanted to minimize the calls that did not use the default). However, I also removed it from this call to compare_images, which does not have a default tol.

In the spirit of the original patch, I have set tol to 10 (the default in image_comparison) rather than 50, which it was originally set to. Apparently neither my local build nor Travis is running this test, so could somebody please figure out how to run it and make sure the tolerance of 10 is not too strict?
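For context on what the tolerance means here: the comparison computes an RMS difference between the pixel values of the expected and actual images, and the test passes when that value is at or below tol. Below is a simplified, self-contained sketch of that idea; it is not matplotlib's actual implementation, and the function names (rms_difference, images_close) are made up for illustration:

```python
import math

def rms_difference(expected, actual):
    """Root-mean-square difference between two equal-length
    sequences of pixel values (a simplified stand-in for the
    per-pixel comparison that compare_images performs)."""
    diffs = [(e - a) ** 2 for e, a in zip(expected, actual)]
    return math.sqrt(sum(diffs) / len(diffs))

def images_close(expected, actual, tol=10):
    """Return None if the "images" are within tolerance, otherwise
    the RMS error (loosely mirroring compare_images' convention of
    returning a falsy value on success)."""
    err = rms_difference(expected, actual)
    return None if err <= tol else err

# Two tiny "images" whose pixels differ slightly, e.g. because
# text rendering varies between systems.
a = [100, 120, 140, 160]
b = [100, 144, 140, 160]  # one pixel off by 24

print(images_close(a, b, tol=10))  # 12.0 -> fails at tol=10
print(images_close(a, b, tol=14))  # None -> passes at tol=14
```

This is why a small bump in tol can absorb system-specific rendering differences while still catching genuinely wrong output.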

@mdboom
Owner

This works for me on my local machine

@ajdawson

I'm getting two failed tests with this: matplotlib.tests.test_backend_pgf.test_rcupdate and matplotlib.tests.test_backend_pgf.test_pdflatex. In both cases the failed diffs indicate the differences are solely in text. I'm not sure if this means the tolerance should be changed, or if this is some peculiarity of my setup.

[Attached image: pgf_pdflatex_pdf failed diff]

@ajdawson

I fiddled with the tolerance parameter and found that, for my system, a tolerance of 14 was enough to remove both failures, in case that helps put my previous comment into context.

@mgiuca

Thanks for testing again, Andrew.

We've* noticed that the text rendering can be a bit off on different systems, which is why there is a tolerance at all. So I think we agreed to generally set the tolerances to 10 by default, and bump them up to whatever is necessary whenever someone has a "correct" render that fails the test (as opposed to something genuinely wrong). Good thing you attached the image; it looks like you have hit just this case, so I am bumping the tolerance up to 14. Can you test it again and make sure it passes?

*I say "we" not implying that I'm a maintainer of this library. I just contributed a rewrite of the comparison code, and these were my findings at the time.

@ajdawson

This successfully removes the test failures for me.

@dmcdougall
Collaborator
$ python tests.py matplotlib.tests.test_backend_pgf
...
----------------------------------------------------------------------
Ran 3 tests in 5.570s

OK
@dmcdougall dmcdougall merged commit 74918a4 into matplotlib:master

1 check passed

The Travis build passed.
Commits on Mar 22, 2013
  1. @mgiuca-google

    test_backend_pgf: TypeError.

    mgiuca-google authored
    The tol parameter was missing in the call to compare_images().
Commits on Mar 23, 2013
  1. @mgiuca-google
Showing with 1 addition and 1 deletion.
  1. +1 −1  lib/matplotlib/tests/test_backend_pgf.py
lib/matplotlib/tests/test_backend_pgf.py

@@ -57,7 +57,7 @@ def compare_figure(fname):
     expected = os.path.join(result_dir, "expected_%s" % fname)
     shutil.copyfile(os.path.join(baseline_dir, fname), expected)
-    err = compare_images(expected, actual)
+    err = compare_images(expected, actual, tol=14)
     if err:
         raise ImageComparisonFailure('images not close: %s vs. %s'
                                      % (actual, expected))