[2dcontext] 2d gradient tests need a fuzz factor #21647
Conversation
It seems like you are changing generated files; that cannot be right?
OOPS!
Updated now. I'm still providing new versions of the generated files, because it is not clear when they are generated. This also touches some existing files because I removed the need to specify a tolerance value if you're using the default.
So this is defaulting the tolerance to 3 and changing some asserts from `==` to `==~`. That seems fine, though I'm not sure I'm the correct person to judge. The other gentestutils.py changes seem suspect, though.
Are equivalent changes needed for offscreen-canvas?
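For context, a minimal Python sketch (names hypothetical, not the actual gentestutils.py helpers) of the difference between an exact `==` pixel assertion and a fuzzy `==~` one with a default tolerance:

```python
def assert_pixel(actual, expected):
    """Exact comparison: every RGBA channel must match."""
    assert actual == expected, f"{actual} != {expected}"


def assert_pixel_approx(actual, expected, tolerance=3):
    """Fuzzy comparison: each channel may differ by up to `tolerance`."""
    assert all(abs(a - e) <= tolerance for a, e in zip(actual, expected)), \
        f"{actual} not within {tolerance} of {expected}"


# A dithered green pixel that is off by one channel value fails the exact
# check but passes the fuzzy one:
assert_pixel_approx((0, 254, 0, 255), (0, 255, 0, 255))
```

Defaulting the tolerance in the function signature is what removes the need to repeat a value at every call site.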
2dcontext/tools/gentestutils.py
Outdated
```python
# cr.set_source_rgba(r, g, b, a)
# cr.rectangle(0, 0, w, h)
# cr.fill()
# surface.write_to_png('%s/%s' % (IMAGEOUTPUTDIR, filename))
```
This doesn't seem correct.
Indeed. This is a mistake (brought about by #19780).
As for generating, I think you are supposed to generate them in the same patch. They are not generated by the build system at the moment.
There just isn't a build system for tests. The only thing that gets built is the manifest, and the overhead of having to do that is already rather high. So tests that happen to be generated from data have to have their final generated form checked in.
Updated patch. I removed the mistakenly commented-out code. I've also reduced the tolerance back to 2, which is the value most often used throughout the code. Hopefully that is enough to not get caught up by our dithering.
Offscreen canvas probably does need similar changes.
So this looks okay to me at this point, but I'm not sure I'm the right reviewer.
I'm not sure what is required before merging; the Community-TC tasks have several failing checks.
The stability failures are just that lots of tests are affected, so the job is timing out; those aren't worrying. The lint failure seems to be breakage that landed on master but is now fixed; it will go away when CI reruns. The one thing that needs to be fixed here is the built-tests check: you need to check in the updated generated tests (wpt doesn't have a build system as such, so all the generated tests need to be in the repo). To regenerate the files, I believe you run
But that has been done, no? There are lots of changed generated tests, in line with what you'd expect from the .yaml changes.
Right. The patch does include all the updates to the generated files. I'll push it again for luck.
@jgraham it failed again, and I'm missing why. The commit includes all the changes to the generated files. The "built" step seems to run the tools and produce a diff.
Patching in your change and running locally, it looks like you maybe missed a bunch of files? Running
Thanks @stephenmcgruer. That was the problem. I was running
Some implementations, in particular WebKit on macOS/iOS, add some dithering with noise to their gradients in order to provide a nicer visual output. This means you can't test for exact pixel values in the resulting image. Instead you should check the pixel values with a small tolerance. I'd already done this for some gradient tests, but not all. The actual fix here is to use `_assertPixelApprox` instead of `_assertPixel`. Rather than specify a tolerance at each call site, I changed the function definition to have a default tolerance value.
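To illustrate why a tolerance is needed, here is a toy simulation of dithering (not WebKit's actual algorithm): per-channel noise of at most ±1 means an exact comparison can fail, while a per-channel tolerance of 2 always passes.

```python
import random


def dither(pixel, amplitude=1):
    """Toy dithering: add small random noise to each colour channel."""
    clamp = lambda v: max(0, min(255, v))
    r, g, b, a = pixel
    return tuple(clamp(v + random.randint(-amplitude, amplitude))
                 for v in (r, g, b)) + (a,)


expected = (120, 64, 200, 255)
noisy = dither(expected)
# Exact equality may or may not hold after dithering, but every channel
# stays within a tolerance of 2, so a fuzzy assertion is stable.
assert all(abs(n - e) <= 2 for n, e in zip(noisy, expected))
```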
So now it is at a point where the Chrome and Firefox bots fail. I'll have to look to see if that was caused by this change (although it weakens the accuracy requirements, so it shouldn't cause failures).
The stability check failures are just timeouts due to the number of tests involved (known issue). The wpt.fyi results look good for Chrome and Firefox. Safari reports one regression, but looking at it, I think it is a flaky test. So, I'm happy to admin-merge this, but would like to understand why @annevk thought they weren't the right reviewer (do you feel we need someone relating to the gradient spec to review this?).
I "know" canvas, but I'm not up to date on the extent to which we've given up on pixel-perfect accuracy for various features across engines.
Btw, I've opened an issue on CSSWG about explicitly allowing dithering in gradients w3c/csswg-drafts#4793; currently the spec disallows it if read strictly. |
Thanks Tab! @grorg do you mind if this PR waits until CSSWG gets a chance to discuss it? (I'm happy for this to be merged once the CSSWG have said they're happy, even before the spec change.) |