[css-color-6] contrast-color() and gamut mapping #8539
Comments
Not sure this is sufficient. I think we should ensure that the color the calculation is performed on is the same color that is being displayed, so the spec should probably say that the color is gamut mapped first to the output device gamut.
I agree with @LeaVerou: the luminance of the gamut-mapped color should be used, and this can easily differ from merely clamping the luminance of the out-of-gamut (OOG) color.
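To make the distinction concrete, here is a minimal Python sketch (not spec text; naive channel clipping stands in for a real gamut-mapping algorithm) showing that clamping the luminance of an out-of-gamut color and taking the luminance of the gamut-mapped color can give very different answers:

```python
# WCAG relative luminance of an sRGB triple. The 12.92 / 2.4 constants
# are the standard sRGB transfer function; nothing here is spec text.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

oog = (2.0, 0.5, 0.5)  # color(srgb 2 0.5 0.5), out of gamut

# Option A: compute Y from the out-of-gamut color, then clamp Y.
clamped_y = min(max(luminance(oog), 0.0), 1.0)

# Option B: "gamut map" first (here: naive channel clip), then compute Y.
mapped_y = luminance(tuple(min(max(c, 0.0), 1.0) for c in oog))

print(clamped_y)  # 1.0   (wcag2 contrast vs. black: 21:1)
print(mapped_y)   # ~0.38 (wcag2 contrast vs. black: ~8.6:1)
```

A real implementation would use the device's actual gamut mapping (e.g. CSS Color 4's OKLCH chroma reduction) rather than channel clipping, but the two options already disagree badly even in this toy case.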
Does this require CSS to have a function to gamut map a color to the display, or can it be done internally in `contrast-color()`?
I wondered about that. The used color is not typically exposed (some fingerprinting risk), so that would be an advantage of a built-in `contrast-color()`.
I wonder: if fingerprinting is a risk of gamut mapping against an arbitrary gamut, wouldn't the risk be present in any function that uses it under the hood, including `contrast-color()`? Is it a goal of the WICG Color API to ever provide a similar gamut-mapping function to the user? If so, should all instances of gamut mapping (or at least the directly/indirectly accessible functions) work against a list of predefined, approximate gamuts, à la the `color-gamut` media query?
I think that preventing any API that has some slight risk of fingerprinting is an anti-pattern; instead, the potential drawbacks should be weighed against the potential benefits. We see this in HDR, where knowledge of the ambient light level is needed to calculate the HDR headroom, but there is a fingerprinting worry, so we can't expose that, and in consequence the user can't see the HDR image or video at all because the sun is shining, but hey, at least we preserved 0.1 bits of fingerprint entropy. Getting back to the specific question: yes, I think the Color API should provide that function, which will be far more consistent, simpler, and likely more efficient than forcing users to implement it themselves. But such things should be raised and discussed on the Color API repo.
What @svgeesus said. Plus, as a general principle, when a decision involves a tradeoff between accessibility and fingerprinting avoidance, we should err on the side of a11y. To summarize, the proposal we need a resolution on is: before calculating contrast for any color pair, regardless of contrast algorithm, gamut map both colors to the screen gamut. Side thought: if CSS has access to this operation, I wonder if it may make sense to expose it to authors somehow, e.g. so they could create design systems based on actually displayable accent colors. But that's a topic for another issue.
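A hedged sketch of the proposed order of operations, with naive channel clipping standing in for the device's real gamut mapping and the `wcag2` ratio standing in for "any contrast algorithm" (all names here are illustrative, not spec text):

```python
# Sketch: gamut-map both colors to the output gamut *first*, then run
# the contrast algorithm on the mapped values.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    r, g, b = (srgb_to_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def gamut_map(rgb):
    # Placeholder: a real implementation maps to the *device* gamut
    # (e.g. via CSS Color 4's OKLCH chroma reduction), not a clip.
    return tuple(min(max(c, 0.0), 1.0) for c in rgb)

def wcag2_contrast(fg, bg):
    # Proposed order: map first, then measure.
    lf, lb = luminance(gamut_map(fg)), luminance(gamut_map(bg))
    return (max(lf, lb) + 0.05) / (min(lf, lb) + 0.05)
```

With this ordering, `wcag2_contrast((2.0, 0.5, 0.5), (0.0, 0.0, 0.0))` reports roughly 8.6:1, the contrast of what is actually displayed, rather than a ratio derived from a non-displayable color.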
I am fully behind the proposal, because the alternative (calculating the contrast with the specified color, even if it is out of gamut for the display, or indeed not a visible color at all) makes no sense whatsoever.
The CSS Working Group just discussed `[css-color-6] contrast-color() and gamut mapping`.
The full IRC log of that discussion:

&lt;emeyer&gt; chris: We can either say you do contrast between the color you can see and another color, or we say it's between something that might not be a color and some other color
&lt;emeyer&gt; lea: We need to hear from implementors whether this is possible
&lt;emeyer&gt; chris: I did touch on that in the issue; we don't expose the used value of a color and don't need to for this
&lt;emeyer&gt; lea: Not concerned about fingerprinting, I'm concerned about whether all OSes can do this
&lt;emeyer&gt; TabAtkins: They'll have what they used to present to the monitor
&lt;emeyer&gt; lea: Do colors get gamut-mapped to something they can present?
&lt;Rossen_&gt; q?
&lt;emeyer&gt; TabAtkins: Not sure
&lt;emeyer&gt; lea: We could conditionally resolve, saying we'll do something if it's actually implementable
&lt;chrishtr&gt; hi
&lt;emeyer&gt; chrishtr: What's the actual proposal here?
&lt;emeyer&gt; chris: To use the actual displayed color when calculating contrast
&lt;TabAtkins&gt; (As much as that information is available.)
&lt;fantasai&gt; +1 to the proposal
&lt;emeyer&gt; chrishtr: So gamut mapping should be taken into account when doing this?
&lt;emeyer&gt; lea: The rationale being otherwise we could get calculation between two non-existent colors
&lt;emeyer&gt; Rossen_: It seems like this makes the most sense
&lt;emeyer&gt; …The question is whether you can get those, and how much work it will take to do so
&lt;fantasai&gt; s/colors/colors. Otherwise you might decide there's enough contrast, but after gamut-mapping there isn't./
&lt;emeyer&gt; …In terms of intended functionality, what Chris is proposing here makes total sense
&lt;emeyer&gt; …If implementors say this is impossible, we should drop the entire thing
&lt;fantasai&gt; +1
&lt;emeyer&gt; chrishtr: I'd like to wait a week to resolve
&lt;emeyer&gt; Rossen_: That objects to what makes sense
&lt;lea&gt; +1
&lt;emeyer&gt; …So let's resolve on what makes sense, and if it turns out to be impossible, we revisit
&lt;emeyer&gt; RESOLVED: To use the actual displayed color when calculating contrast
I discussed this with Chromium implementation experts. They believe we will be able to implement a `contrast-color()` that takes into account OS and hardware gamut mapping. My suggestion is to word the spec accordingly. How about:
@chrishtr Great to hear that there are no implementation hurdles around this! I noticed you're using "take into account OS and hardware gamut mapping" both in your comment and in your proposed spec text. Does this mean they plan to take it into account in a different way than what is proposed here? If not, I would prefer the more precise wording that implementations should gamut map both colors in a color pair to the gamut of the output device prior to applying any contrast calculations. Please note that the decision in this issue doesn't just affect the Level 5 `contrast-color()`.

@svgeesus @danburzo I just realized an issue with what we're discussing. There can be multiple output devices, and CSS functions cannot return different used values on different output devices (can they?), not to mention that the same element, with e.g. a flat background-color, can be displayed across multiple screens at the same time. We also cannot ask implementations to gamut map to the intersection of the gamuts: a) because that's nontrivial to calculate, and b) because it produces a poor user experience (imagine connecting an auxiliary low-quality screen to get some more space, and suddenly this affects your viewing experience everywhere!). Perhaps we can just use the gamut of the primary display device, which should be more stable?
I don't think it's necessary to require implementations to compute the actual value in the gamut of the output device. They just need to somehow ensure contrast. One way to potentially do that without computing the true device-gamut value would be to choose a contrasting value in the internal color space, and analyze the hardware to ensure that whatever adjustments to that color may occur won't make it stop contrasting.
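One way to read this suggestion, as a rough sketch: require candidates to clear the contrast threshold by a device-derived safety margin, instead of computing the true device-gamut value. `DEVICE_MARGIN` and the function names below are hypothetical, purely to illustrate the idea:

```python
# Margin-based alternative: pick a candidate whose contrast exceeds the
# threshold by enough headroom to absorb whatever shift the OS/monitor
# hardware may apply. The margin value is an assumed, illustrative
# worst case, not derived from any real hardware analysis.

DEVICE_MARGIN = 1.5  # assumed worst-case contrast loss from hardware mapping

def pick_contrasting(base_lum, candidate_lums, threshold=4.5):
    """Return the first candidate luminance that clears threshold + margin."""
    for lum in candidate_lums:
        hi, lo = max(base_lum, lum), min(base_lum, lum)
        ratio = (hi + 0.05) / (lo + 0.05)   # WCAG 2 contrast ratio
        if ratio >= threshold + DEVICE_MARGIN:   # not merely borderline
            return lum
    return None   # no candidate is safely contrasting
```

For example, against a black background (`base_lum=0.0`), a mid-gray candidate (`lum=0.2`, ratio 5:1) would be rejected as borderline, while white (`lum=1.0`, ratio 21:1) would be accepted.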
Right. Nevertheless, I think it suffices to ensure that the chosen color has accessible contrast for the user when they look at it. Do you agree?
I might be off here, but it sounds like there's an underlying assumption that gamut mapping can only cause a small delta, so that as long as you choose a sufficiently contrasting color the delta won't affect it, and implementations would just need to make sure they don't pick a borderline color. However:
I agree that this is the end goal, but specs usually need to describe the how, not just the high-level goal. There are cases where certain details are left up to the UA, but typically there needs to be a reason, because the more details are left up to the UA, the more inconsistencies there are across browsers, and authors suffer as a result. If there is a better/faster algorithm than gamut mapping both colors first, it's much better to talk it through and actually put it in the spec so that all implementations can benefit :)
You make a good point that there might be more UA inconsistencies as a result. However, I think there likely isn't an algorithm that interoperably takes into account all OS/monitor hardware aspects, because monitors vary in their implementation. At this point, though, we're (or Chromium is, at least) at the experimental phase. Chromium intends to prototype the Level 5 black/white `contrast-color()`.
I have little experience with the color management pipeline, and I'm not a habitual user of secondary output devices, but a quick test of the `color-gamut` media query suggests it does respond to the current output device.
@danburzo I believe MQs are different: functions need to return an actual value, and I'm not sure that value can be different depending on the output device. If it can be, perfect, but we should check with implementors.
Some color syntaxes allow out-of-gamut colors that are retained as-is until they're ultimately gamut-mapped for display. Examples include `rgb(2 2 2)` and `color(srgb 2 2 2)`. Luminance contrast computation should ideally reflect the contrast of the colors being displayed. To that end, I believe the CIE Y luminance could be clamped to `[0, 1]` in the computation of `wcag2` contrast, and of other contrast keywords / functions.
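For illustration, the clamping proposal above could look like the following sketch (not spec text, and note that the discussion in this thread moved toward gamut mapping the colors instead of clamping Y; `wcag2_contrast` here is just the WCAG 2 ratio over clamped luminances):

```python
# Sketch of the original proposal: clamp CIE Y to [0, 1] inside the
# wcag2 contrast computation, so out-of-gamut colors such as
# color(srgb 2 2 2) don't produce luminances above 1.

def srgb_to_linear(c):
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def clamped_luminance(rgb):
    # CIE Y from a (possibly out-of-gamut) sRGB triple, clamped to
    # [0, 1] as proposed above.
    y = sum(w * srgb_to_linear(c)
            for w, c in zip((0.2126, 0.7152, 0.0722), rgb))
    return min(max(y, 0.0), 1.0)

def wcag2_contrast(fg, bg):
    lf, lb = clamped_luminance(fg), clamped_luminance(bg)
    return (max(lf, lb) + 0.05) / (min(lf, lb) + 0.05)

# color(srgb 2 2 2) now behaves like white for contrast purposes:
# wcag2_contrast((2.0, 2.0, 2.0), (0.0, 0.0, 0.0)) -> 21:1, the maximum
```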