
Chrome Dev & Edge Dev #96

Closed
jensimmons opened this issue Jul 19, 2022 · 7 comments

Labels: meta (Process and/or repo issues), user-interface (Presentation of Interop scores)

Comments

jensimmons (Contributor) commented Jul 19, 2022

I just ran across yet another page on Can I Use where Chrome and Edge have different support for a thing. (One supports it, the other doesn't.) Now that Can I Use put Chrome and Edge desktop browsers next to each other, it's very easy to see such differences jump out.

Which led me to wonder: why do they share a single column on the Interop dashboard? Are we in fact testing both? Or are we testing Chrome and just labeling that score as "Edge, too. Must be the same."

foolip (Member) commented Jul 20, 2022

When it comes to differences between Chrome and Edge in Can I Use or BCD, most such differences are probably incorrect, at least in BCD. Improving on that is part of openwebdocs/project#119, in particular openwebdocs/project#85.

foolip (Member) commented Jul 20, 2022

I've filed web-platform-tests/wpt#34913.

foolip (Member) commented Jul 20, 2022

Yes, there certainly are differences; some might even be persistent and wouldn't go away with time.

@dlibby- @kypflug do you have thoughts on our current setup?

Also cc @jgraham given web-platform-tests/wpt.fyi#1519.

dlibby- commented Jul 21, 2022

I'm not very familiar with the setup, but sampling the failures, I see these broad categories:

  • Incomplete results (looks to be tracked by "Incomplete Edge Dev results", web-platform-tests/wpt#34913; Mustapha is the best person to comment on those)
  • Legitimate differences (e.g. the modal dialog tests; not sure why we have a difference there, but I filed an internal bug for the owning team to follow up)
  • Anti-aliasing pixel differences.
  • Successive Ahem 'X' characters aren't creating a solid rectangle in some of the tests.

For the last two, I can reproduce the same results in Chrome on Windows. The anti-aliasing differences I spot-checked are a bit odd, given that these are all reftests, AFAICT.

Given that Edge is the only Windows-based run, it's possible that there are platform-specific differences or Chromium bugs that are causing these failures. The interesting bit to me is that we run these tests in both Chromium and Edge CI and change validation, and we don't see the same failures on Windows there. So perhaps there's a test configuration (intuition says something related to turning off font anti-aliasing) that causes these to show up.

I'm going to be out for the next few days, but I'll follow up with more detailed investigation and finish combing through the failures next week.

foolip (Member) commented Feb 15, 2024

This has been resolved in https://wpt.fyi/interop-2024 by separating Chrome and Edge.

foolip closed this as completed Feb 15, 2024