test_backend_pgf: TypeError #1843

Merged
merged 2 commits into matplotlib:master on Mar 23, 2013

Contributor

mgiuca-google commented Mar 22, 2013

The tol parameter was missing in the call to compare_images().

Contributor

mgiuca-google commented Mar 22, 2013

This is a fix for Issue #1842.

The problem was that I deleted the tol parameter from all the calls to decorators.image_comparison (since it defaults to 10 and we wanted to minimize the number of calls that did not use the default). However, I also removed it from this call to compare_images, which does not have a default tol.

In the spirit of the original patch, I have set tol to 10 (the default in image_comparison) rather than 50, which it was originally set to. Apparently neither my local build nor Travis is running this test, so could somebody please figure out how to run it and make sure a tolerance of 10 is not too strict?
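For context, here is a rough sketch of what the fixed comparison looks like. The helper name, file names, and directory layout are illustrative rather than copied from the real matplotlib.tests.test_backend_pgf module; the point is that compare_images takes tol as a required argument, so omitting it raises the TypeError from #1842, while tol=10 mirrors the image_comparison decorator's default.

    import os
    import matplotlib.pyplot as plt
    from matplotlib.testing.compare import compare_images
    from matplotlib.testing.exceptions import ImageComparisonFailure

    def compare_figure(fname, tol=10):
        # Save the current figure and compare it against a stored baseline
        # image within an RMS tolerance of `tol`.
        actual = fname
        plt.savefig(actual)
        expected = os.path.join('baseline_images', 'test_backend_pgf', fname)
        # compare_images has no default value for tol, so leaving it out
        # raises a TypeError; it returns None when the images agree.
        err = compare_images(expected, actual, tol=tol)
        if err:
            raise ImageComparisonFailure(err)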

Member

mdboom commented Mar 22, 2013

This works for me on my local machine

Contributor

ajdawson commented Mar 22, 2013

I'm getting two failed tests with this: matplotlib.tests.test_backend_pgf.test_rcupdate and matplotlib.tests.test_backend_pgf.test_pdflatex. In both cases the failed diffs indicate the differences are solely in the text. I'm not sure if this means the tolerance should be changed, or if this is some peculiarity of my setup.

[attached image: pgf_pdflatex_pdf-failed-diff]

Contributor

ajdawson commented Mar 22, 2013

I fiddled with the tolerance parameter and found that, for my system, a tolerance of 14 was enough to remove both failures, in case that helps put my previous comment into context...

mgiuca commented Mar 23, 2013

Thanks for testing again, Andrew.

We've* noticed that the text rendering can be a bit off on different systems, which is why there is a tolerance at all. So I think we agreed to generally set the tolerances to 10 by default, and bump them up to whatever is necessary whenever someone has a "correct" render that fails the test (as opposed to something genuinely wrong). Good thing you attached the image; it looks like you have hit exactly this case, so I am bumping the tolerance up to 14. Can you test it again and make sure it passes?

*I say "we" without implying that I'm a maintainer of this library. I just contributed a rewrite of the comparison code, and these were my findings at the time.

Contributor

ajdawson commented Mar 23, 2013

This successfully removes the test failures for me.

Member

dmcdougall commented Mar 23, 2013

$ python tests.py matplotlib.tests.test_backend_pgf
...
----------------------------------------------------------------------
Ran 3 tests in 5.570s

OK

dmcdougall added a commit that referenced this pull request Mar 23, 2013

dmcdougall merged commit 74918a4 into matplotlib:master on Mar 23, 2013

1 check passed: The Travis build passed