test_backend_pgf: TypeError #1843

Merged
2 commits merged into matplotlib:master on Mar 23, 2013

Conversation

Contributor

mgiuca-google commented Mar 22, 2013

The tol parameter was missing in the call to compare_images().

test_backend_pgf: TypeError.
The tol parameter was missing in the call to compare_images().
Contributor

mgiuca-google commented Mar 22, 2013

This is a fix for Issue #1842.

The problem was that I deleted the tol parameter from all the calls to decorators.image_comparison (since it defaults to 10 and we wanted to minimize the calls that did not use the default). However, this particular call was to compare_images(), which has no default tol, so removing the argument caused the TypeError.
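
A minimal sketch of the decorator usage being described, assuming the default-tol behaviour stated above (illustrative only; the test name and baseline image name are made up, not taken from the real test suite):

    from matplotlib.testing.decorators import image_comparison
    import matplotlib.pyplot as plt

    # tol is omitted: per the comment above, image_comparison defaults it to 10.
    @image_comparison(baseline_images=['pgf_example'])
    def test_pgf_example():
        plt.figure()
        plt.plot([1, 2, 3])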

In the spirit of the original patch, I have set tol to 10 (the default in image_comparison) rather than the 50 it was originally set to. Apparently, neither my local build nor Travis is running this test, so could somebody please figure out how to run it and make sure a tolerance of 10 is not too strict.
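
For reference, a rough sketch of the kind of call the fix restores, with tol passed explicitly because compare_images() has no default (the file names below are placeholders, not the paths used by the real test):

    from matplotlib.testing.compare import compare_images

    # compare_images() returns None when the images match within tol,
    # otherwise a message describing the mismatch.
    err = compare_images('baseline/pgf_xelatex.pdf',
                         'result_images/pgf_xelatex.pdf',
                         tol=10)
    if err is not None:
        raise AssertionError(err)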

Owner

mdboom commented Mar 22, 2013

This works for me on my local machine.

Contributor

ajdawson commented Mar 22, 2013

I'm getting two failed tests with this: matplotlib.tests.test_backend_pgf.test_rcupdate and matplotlib.tests.test_backend_pgf.test_pdflatex. In both cases the failed diffs indicate that the differences are solely in the text. I'm not sure whether this means the tolerance should be changed, or if this is some peculiarity of my setup.

[Attached image: pgf_pdflatex_pdf failed diff]

Contributor

ajdawson commented Mar 22, 2013

I fiddled with the tolerance parameter and found that, for my system, a tolerance of 14 was enough to remove both failures, in case that helps put my previous comment into context.
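
One way to make that kind of measurement locally (a sketch only; the paths are placeholders and this is not necessarily how the value 14 was found) is to retry compare_images() with increasing tolerances until the comparison stops reporting a mismatch:

    from matplotlib.testing.compare import compare_images

    expected = 'baseline_images/test_backend_pgf/pgf_pdflatex.pdf'
    actual = 'result_images/test_backend_pgf/pgf_pdflatex.pdf'

    # Stop at the first integer tolerance for which the comparison passes.
    for tol in range(1, 51):
        if compare_images(expected, actual, tol=tol) is None:
            print('smallest passing tolerance:', tol)
            break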

mgiuca commented Mar 23, 2013

Thanks for testing again, Andrew.

We've* noticed that text rendering can be a bit off on different systems, which is why there is a tolerance at all. I think we agreed to generally set the tolerances to 10 by default, and bump them up to whatever is necessary whenever someone has a "correct" render that fails the test (as opposed to something genuinely wrong). Good thing you attached the image; it looks like you have hit exactly this case, so I am bumping the tolerance up to 14. Can you test it again and make sure it passes?

*I say "we" without implying that I'm a maintainer of this library. I just contributed a rewrite of the comparison code, and these were my findings at the time.

Contributor

ajdawson commented Mar 23, 2013

This successfully removes the test failures for me.

Member

dmcdougall commented Mar 23, 2013

$ python tests.py matplotlib.tests.test_backend_pgf
...
----------------------------------------------------------------------
Ran 3 tests in 5.570s

OK

dmcdougall added a commit that referenced this pull request Mar 23, 2013

dmcdougall merged commit 74918a4 into matplotlib:master Mar 23, 2013

1 check passed: the Travis build passed.