
Convert SVG 1.1 marker tests to reftests. #4015

Closed

Conversation

@nikosandronikos (Member) commented Oct 18, 2016

Convert SVG 1.1 marker tests to reftests.

@wpt-pr-bot (Collaborator)

Reviewers for this pull request are: @Ms2ger and @heycam.

@sideshowbarker (Contributor) commented Oct 19, 2016

@nikosandronikos see https://travis-ci.org/w3c/web-platform-tests/builds/168782053 —several of the files in this PR have trailing whitespace; e.g., line 5 of svg/painting/marker-001.svg. All the trailing whitespace needs to be zapped to make the linter happy.
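For reference, one way to strip it in bulk (a sketch, assuming GNU sed and the svg/painting directory; adjust the paths as needed) is:

find svg/painting -name '*.svg' | xargs sed -i 's/[ \t]*$//'

which deletes trailing spaces and tabs from every line of the matched files in place.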

@nikosandronikos (Member, Author)

@sideshowbarker No problem - any idea why the linter might not report the whitespace when run locally?

@nikosandronikos (Member, Author)

> @sideshowbarker No problem - any idea why the linter might not report the whitespace when run locally?

Ok, user error I think - I have to pass the full file rather than a path. I was assuming passing a path caused all files in that path to be checked.

@sideshowbarker (Contributor)

> any idea why the linter might not report the whitespace when run locally?
>
> Ok, user error I think - I have to pass the full file rather than a path. I was assuming passing a path caused all files in that path to be checked.

Yeah, sorry, we should document that better. I think most people would reasonably expect that if you pass it a directory name as a path it will recursively check all the files in the directory.

FWIW the way I personally work around it for now is just to do something like:

find svg | xargs ./lint

…which is relatively fast

@sideshowbarker (Contributor)

By the way, I guess you already noticed (I did only just now) that along with flagging the trailing whitespace, the linter has also flagged svg/painting/marker-005.svg as PARSE-FAILED.

That seems to be caused by the U+001C character on line 89 of that file, which I guess is an illegal character in XML documents?
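For what it's worth, assuming GNU grep, the offending byte can be located with something like:

grep -anP '\x1c' svg/painting/marker-005.svg

U+001C is a C0 control character outside the XML 1.0 Char production (which only allows #x9, #xA, #xD and characters from #x20 up, minus a few excluded ranges), so the parser rejects the document outright.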

@nikosandronikos (Member, Author)

@sideshowbarker I did see that. Thanks!

@sideshowbarker (Contributor)

w3c-test:mirror

@sideshowbarker (Contributor) left a comment

I looked over all these tests and they all seem right as far as the conversion to reftests goes. I did not review them against the SVG spec, but I assume (hope) the originals went through review already somewhere? If so then I can go ahead and merge them to master. Otherwise do you have another SVG subject matter expert who can review them?

@@ -0,0 +1,23 @@
<svg width="100%" height="100%" viewBox="0 0 480 360"
@sideshowbarker (Contributor) commented on this diff:

I notice that in Firefox this svg/painting/marker-007.svg test displays nothing. Is that an expected failure? Gecko doesn’t conform to the spec on this?

@nikosandronikos (Member, Author) replied:

Yes, Gecko doesn't seem to conform to the spec on that one. I can't find a bug in bugzilla, but I'll raise one once the test is reviewed and lands.
Here's the SVG 1.1 SE test suite link for reference: https://www.w3.org/Graphics/SVG/Test/20110816/harness/htmlObjectApproved/painting-marker-07-f.html

@nikosandronikos (Member, Author)

> I looked over all these tests and they all seem right as far as the conversion to reftests goes. I did not review them against the SVG spec, but I assume (hope) the originals went through review already somewhere? If so then I can go ahead and merge them to master. Otherwise do you have another SVG subject matter expert who can review them?

All the tests except 008 were previously reviewed as SVG 1.1 SE tests (008 was still a draft, so unreviewed). But we should review whether these tests still assert what is described in SVG 2.

As we create the SVG 2 test suite, it will likely be myself, @AmeliaBR, and @Tavmjong who will do most of the reviewing, so I'll try to get one of them to take a look. And it would be good to give them (and me) access to merge PRs at some point.

@AmeliaBR (Contributor)

Review for accuracy per Spec:

  • marker-001.svg: Nothing controversial here & nothing changed.

  • marker-002.svg: A little more complicated (IE fails this test) but all to spec & nothing new. One concern, though: I'm noticing visible differences in anti-aliasing, in multiple user agents, between the marker test and the reference for the marker with stroke. It's slight, but probably enough to make automated tests flaky. The issue is that the stroke extends outside the viewBox of the marker, and so it is being clipped by the hidden overflow. The reference re-creates the effect with a clip-path, but there are also differences in coordinate systems that seem to be creating rounding differences. It may be possible to get a closer match by using nested <svg>, with viewBox and hidden overflow, instead of transforms and clip-path (a rough sketch is at the end of this comment).

  • marker-003.svg and marker-004.svg: These are rather complex-looking tests for what they are testing (that the shorthand property works), but all looks okay.

    Aside: Is there a way to note test dependencies in the WPT metadata? E.g., marker-003-ref.svg uses the basic marker presentation attributes as a reference for the CSS test, so it is dependent on basic marker support (i.e., marker-001.svg passing). A user agent that doesn't support markers at all would get a false positive here.

  • marker-005.svg: I had to look up this one to see whether we'd changed it. _The reference SVG is wrong for overflow: auto. It should match visible._ @nikosandronikos can you make the change?

    Also, I'm again seeing slight anti-aliasing differences between the test and reference files. In this case, the clipping effect has been applied by actually modifying the underlying paths to match the clipped shape. Here, I'd be a bit worried about replacing it with <svg> with hidden vs visible overflow, for fear of false positives (e.g., visible overflow being broken for both <svg> and <marker>).

  • marker-006.svg: The third sub-test in this file (the auto-orientation one) seems to be identical to the equivalent sub-test from marker-001.svg. It isn't using offset viewBox values, which this file is supposed to be testing. Not sure whether that's a problem copied over from the SVG 1.1 test suite, but either way it would be nice to make it actually test something new.

  • marker-007.svg: This one is correct & to spec in both SVG 1.1 and SVG 2, despite failing in both Edge (partly) and Firefox (completely).

  • marker-008.svg: The comment with the test description is no longer correct in SVG 2 (these are not the only markable elements). The test is correct for what it tests. Anti-aliasing discrepancies again.

I'd be okay approving the changes with the one correction to marker-005.svg. However, maybe some browser devs could run these through their testing systems to see if the anti-aliasing differences are causing false failures. If so, some fussing may be required.

And of course, there are new marker features that we need to add tests for. And lots of old edge cases that these don't cover.
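A rough sketch of the nested-<svg> idea for marker-002 (the geometry and paint values here are made up for illustration, not taken from the actual test): the marker viewport is emulated with an inner <svg> whose viewBox and hidden overflow provide the same coordinate mapping and clipping a <marker> establishes, instead of reproducing them with transform and clip-path:

<svg xmlns="http://www.w3.org/2000/svg" width="480" height="360" viewBox="0 0 480 360">
  <path d="M 100 100 L 200 100" stroke="black" stroke-width="10"/>
  <!-- stand-in for a 20px-square marker viewport at the end vertex (200,100);
       overflow="hidden" clips the stroke that spills outside the viewBox,
       just as the marker element's default hidden overflow would -->
  <svg x="190" y="90" width="20" height="20" viewBox="0 0 10 10" overflow="hidden">
    <path d="M 0 0 L 10 5 L 0 10 Z" fill="black" stroke="blue" stroke-width="2"/>
  </svg>
</svg>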

@nikosandronikos (Member, Author)

Hey @AmeliaBR, thanks for the awesome review.

marker-005.svg is easily fixed.

I've been playing with trying to totally eliminate the anti-aliasing issues and haven't found any great solutions. Everything has some level of difference around the edges. Since this is likely to pop up in lots of our tests, I'd like to work out some recommendations we can give. But it's a bit dependent on how much the differences actually matter.

I'll upload a new version of marker-005.svg tomorrow, and run the tests through WebKit to see whether they pass or fail, then maybe try to get some input from the other browsers.

Regarding your other question - I don't know of any way to mark that test result accuracy is dependent on another test passing. Maybe @gsnedders has some suggestion here? It's probably not the end of the world if a false positive occurs as long as the test it's dependent on fails.

@gsnedders (Member)

w3c-test:mirror

@gsnedders (Member) commented Nov 17, 2016

@nikosandronikos Depending on another test passing is practically inevitable (after all, at some level, all tests rely on some other test passing, even if it is just basic functionality like the color property working), so I don't think we've ever really worried about it beyond trying to minimise the complexity of references such that they're unlikely to bogusly fail.

As for anti-aliasing differences, we badly need to come up with a better solution for fuzzy matching, because just assuming near identical shapes render identically doesn't quite work.

That said, at least in the marker-002 case I'm not sure this is anti-aliasing differences: I think it's subpixel rendering differences. In the bottom left, we start off with a scale(4), which is fine, but then we have a scale(0.2)—and 0.2 cannot be expressed exactly—so I think we end up with 20px being 17px or something like this. I'd try getting rid of all the scales and see if it still renders differently.
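As a quick illustration of the 0.2 point (a sketch; any Python 3 will do):

python3 -c 'from decimal import Decimal; print(Decimal(0.2))'

prints 0.200000000000000011102230246251565404236316680908203125, i.e. the scale factor already carries a small error before the renderer composes it with the viewBox and markerWidth transforms.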

@nikosandronikos (Member, Author)

@gsnedders anti-aliasing is just a way of representing sub-pixel coverage. In the actual test there are multiple levels of transformation happening: we have a viewBox, which defines a transform, markerWidth multiplying in another transformation, etc. So a reference with no transformations still ends up slightly different. We could massage the numbers until we get a set that gives us the least possible loss of precision, but that's quite hard, and it's going to be an ongoing problem.

@nikosandronikos (Member, Author)

@AmeliaBR I have resolved the anti-aliasing differences (AFAICT) and addressed all your comments except the one for marker-006.svg - the test is different from marker-001.svg in that it has a viewBox. Could you double-check your comment please? Or maybe I'm just missing something.

@nikosandronikos (Member, Author)

ping @AmeliaBR in case these are still useful.

@svgeesus requested review from svgeesus and removed the review request for boggydigital, July 31, 2018 14:51
@svgeesus (Contributor)

@sideshowbarker I can't add myself as reviewer, can you help?

@sideshowbarker (Contributor)

> @sideshowbarker I can't add myself as reviewer, can you help?

@svgeesus it seems you’re now listed as a reviewer for this PR, and I also see you listed as a reviewer in https://github.com/web-platform-tests/wpt/blob/master/svg/META.yml (which triggers related tooling and notifications stuff)

So I’m not sure what other buttons to push… Are you still running into some problem?

@svgeesus (Contributor) left a comment

LGTM now that the other reviewer comments have been addressed.
Wanting to merge because the original submitter is no longer active, so any future changes are more easily done by others post-merge.

@svgeesus (Contributor)

@sideshowbarker the remaining issues are:

code-review/reviewable — Repo disconnected, unable to update review status

and four conflicting files (the "Resolve conflicts" button is greyed out).

Would like to land this long-running PR so the SVG WG can take over maintaining the tests.

@gsnedders (Member)

@svgeesus someone will need to file a new PR with these changes, given none of us can touch @nikosandronikos's repo.
