
[2dcontext] 2d gradient tests need a fuzz factor #21647

Closed · wants to merge 1 commit

Conversation

@grorg (Contributor) commented Feb 7, 2020

Some implementations, in particular WebKit on macOS/iOS, add some dithering noise to their gradients in order to produce a nicer visual output. This means you can't test for exact pixel values in the resulting image.

Instead, you should check the pixel values with a small tolerance. I'd already done this for some gradient tests, but not all.

The actual fix here is to use _assertPixelApprox instead of _assertPixel. Rather than specifying a tolerance at each call site, I changed the function definition to have a default tolerance value that seemed appropriate.
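
(Editor's note: as an illustration of the approach, here is a minimal sketch of a per-channel tolerance check. The name, signature, and default shown are assumptions for illustration; the real _assertPixelApprox helper in the WPT canvas harness differs in signature and error reporting.)

```js
// Minimal sketch of a tolerance-based pixel assertion. Illustrative only:
// the name, signature, and default value are assumptions, not the actual
// WPT harness code.
function assertPixelApprox(canvas, x, y, r, g, b, a, tolerance = 2) {
  const data = canvas.getContext('2d').getImageData(x, y, 1, 1).data;
  const expected = [r, g, b, a];
  for (let i = 0; i < 4; i++) {
    // Allow each channel to deviate by at most `tolerance`, which absorbs
    // dithering noise without accepting grossly wrong pixels.
    if (Math.abs(data[i] - expected[i]) > tolerance) {
      throw new Error(`Pixel at (${x},${y}) is ${data.join(',')}, ` +
                      `expected ${expected.join(',')} +/- ${tolerance}`);
    }
  }
}
```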

@annevk (Member) commented Feb 7, 2020

It seems like you are changing generated files; that cannot be right?

@grorg (Contributor, Author) commented Feb 7, 2020

OOPS!

@grorg (Contributor, Author) commented Feb 7, 2020

Updated now.

I'm still providing new versions of the generated files, because it's not clear when they are meant to be regenerated. This also touches some existing files, because I removed the need to specify a tolerance value if you're using the default.

@annevk (Member) left a comment

So this is defaulting tolerance to 3 and changing some asserts from == to ==~. That seems fine, though I'm not sure I'm the correct person to judge. The other gentestutils.py changes seem suspect though.

Are equivalent changes needed for offscreen-canvas?
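
(Editor's note, for context on the `==` vs `==~` point: in the YAML test definitions, `==` makes the generator emit an exact _assertPixel call, while `==~` emits the approximate variant. Roughly, and illustratively; the exact generated calls may differ:)

```js
// @assert pixel 50,25 == 0,255,0,255;   -> exact comparison:
_assertPixel(canvas, 50, 25, 0, 255, 0, 255);

// @assert pixel 50,25 ==~ 0,255,0,255;  -> comparison within the default
// tolerance (the call site no longer needs to pass one explicitly):
_assertPixelApprox(canvas, 50, 25, 0, 255, 0, 255);
```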

@annevk (Member) commented on the diff:

# cr.set_source_rgba(r, g, b, a)
# cr.rectangle(0, 0, w, h)
# cr.fill()
# surface.write_to_png('%s/%s' % (IMAGEOUTPUTDIR, filename))

This doesn't seem correct.

@grorg (Contributor, Author) replied:

Indeed. This is a mistake (brought about by #19780).

@annevk (Member) commented Feb 10, 2020

As for generating, I think you are supposed to generate them in the same patch. They are not generated by the build system at the moment.

@jgraham (Contributor) commented Feb 10, 2020

There just isn't a build system for tests. The only thing that gets built is the manifest, and the overhead of having to do that is already rather high. So tests that happen to be generated from data have to have their final generated form checked in.

@grorg (Contributor, Author) commented Feb 15, 2020

Updated patch. I removed the mistakenly commented-out code.

I've also reduced the tolerance back to 2, which is the value most often used throughout the code. Hopefully that is enough not to get caught out by our dithering.

@grorg (Contributor, Author) commented Feb 15, 2020

offscreen-canvas probably does need similar changes

@annevk (Member) left a comment

So this looks okay to me at this point, but I'm not sure I'm the right reviewer.

@grorg (Contributor, Author) commented Feb 17, 2020

I'm not sure what is required before merging. The Community-TC tasks have:

  • wpt-firefox-nightly-stability is failing a bunch of OffscreenCanvas tests because OffscreenCanvas is not defined. (I'm fairly sure this patch isn't at fault... although maybe?)

  • the wpt-chrome-dev-stability tests are failing nearly everything to do with measureText (unrelated)

  • the lint tests are failing in webaudio/resources/audit.js

  • the update-built job produces a lot of output, with a bunch of warnings that I believe are expected, and a diff at the end which looks correct. I'm not sure how to read the results, or why it thinks it is failing.

@jgraham (Contributor) commented Feb 17, 2020

The stability failures are just because lots of tests are affected, so the job is timing out. These aren't worrying. The lint failure seems to be breakage that landed on master but is now fixed; that will go away when CI reruns.

The one thing that needs to be fixed here is the built-tests thing; you need to check in the updated generated tests (wpt doesn't have a build system as such, so all the generated tests need to be in the repo). To regenerate the files, I believe you run ./build.sh in tools/2dcontext.

@annevk (Member) commented Feb 18, 2020

But that has been done, no? There are lots of changed generated tests, in line with what you'd expect from the .yaml changes.

@grorg (Contributor, Author) commented Feb 18, 2020

Right. The patch does include all the updates to the generated files. I'll push it again for luck.

@grorg (Contributor, Author) commented Feb 18, 2020

@jgraham it failed again, and I can't see why.

The commit includes all the changes to the generated files. The "built" step seems to run the tools and produce a diff.

@stephenmcgruer (Contributor) commented Feb 18, 2020

Patching in your change and running locally, it looks like you maybe missed a bunch of files? Running git diff after ./update-built-tests.sh (again, with your change patched in) shows diffs in 172 offscreen-canvas/ tests (and some pngs, but I think those changes may be bogus).
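
(Editor's note: for anyone retracing this locally, the check that the failing "built" job performs can be approximated as follows. This is a sketch based only on the commands mentioned in this thread, run from the root of the wpt checkout.)

```sh
# Regenerate all built tests. Note the 2dcontext YAML definitions also
# produce the offscreen-canvas tests, so both directories get rebuilt.
./update-built-tests.sh

# A non-empty diff means some regenerated files still need to be committed.
git diff --stat
```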

@grorg (Contributor, Author) commented Feb 18, 2020

Thanks @stephenmcgruer.

That was the problem. I was running the build in just the 2dcontext directory. It wasn't clear to me that the yaml file in 2dcontext also produces the tests in offscreen-canvas.

@grorg (Contributor, Author) commented Feb 18, 2020

So now it is at a point where the Chrome and Firefox bots fail. I'll have to look to see whether that was caused by this change (although it only loosens the required accuracy, so it shouldn't cause new failures).

@stephenmcgruer (Contributor) commented

The stability check failures are just timeouts due to the number of tests involved (known issue). The wpt.fyi results look good for Chrome and Firefox. Safari reports one regression, but having looked at it, I think it's a flaky test.

So, I'm happy to admin-merge this, but I would like to understand why @annevk thought they weren't the right reviewer (do you feel we need someone involved with the gradient spec to review this?).

@annevk (Member) commented Feb 19, 2020

I "know" canvas, but I'm not up-to-date with regards to what extent we've given up on pixel perfect accuracy for various features across engines.

@tabatkins (Contributor) commented

Btw, I've opened an issue on the CSSWG about explicitly allowing dithering in gradients (w3c/csswg-drafts#4793); currently the spec disallows it if read strictly.

@stephenmcgruer (Contributor) commented

Thanks Tab! @grorg do you mind if this PR waits until CSSWG gets a chance to discuss it? (I'm happy for this to be merged once the CSSWG have said they're happy, even before the spec change.)

@grorg closed this by deleting the head repository on Jan 19, 2023.