
Regression test for color distance #5

Closed
hastebrot opened this issue Oct 20, 2015 · 21 comments

@hastebrot

Add a regression test to ensure the color distance is correctly compared against the comparison threshold.

The test fixture could be an image with gray background rgb(0.5, 0.5, 0.5) and an image that contains gray tones with different color offsets (e.g. with rgb(0.6, 0.6, 0.6), which should return a color difference of 0.1).
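
A rough sketch of what such a test could look like (hypothetical code, not from the repo; it assumes the pixelmatch(img1, img2, output, width, height, options) call used later in this thread and Node's assert module):

// hypothetical regression test: 1x1 pixels around rgb(0.5, 0.5, 0.5) and rgb(0.6, 0.6, 0.6)
var assert = require('assert');
var pixelmatch = require('pixelmatch');

var gray50 = [128, 128, 128, 255]; // roughly rgb(0.5, 0.5, 0.5), fully opaque
var gray60 = [153, 153, 153, 255]; // roughly rgb(0.6, 0.6, 0.6), fully opaque
var output = [0, 0, 0, 0];

// with a generous threshold the pixels should count as equal (0 mismatched pixels)
assert.equal(pixelmatch(gray50, gray60, output, 1, 1, {threshold: 0.1}), 0);

// with a very small threshold the single pixel should be reported as different
assert.equal(pixelmatch(gray50, gray60, output, 1, 1, {threshold: 0.01}), 1);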

@hastebrot
Author

Example code (https://jsbin.com/xujizi/edit?js,console):

var gray30 = [0.3, 0.3, 0.3, 255];
var gray50 = [0.5, 0.5, 0.5, 255];
var gray60 = [0.6, 0.6, 0.6, 255];
var gray100 = [1.0, 1.0, 1.0, 255];

console.log(Math.abs(colorDelta(gray30, gray60, 0, 0, true))); // 0.3...
console.log(Math.abs(colorDelta(gray50, gray100, 0, 0, true))); // 0.499...

console.log(Math.sqrt(colorDelta(gray30, gray60, 0, 0, false))); // 0.3...
console.log(Math.sqrt(colorDelta(gray50, gray100, 0, 0, false))); // 0.499...

@mourner mourner added the tests label Oct 20, 2015
@mourner
Member

mourner commented Oct 20, 2015

Just to be clear, this is working correctly, just needs a test, right?
Pull requests welcome. :) I guess colorDelta could also be exported as pixelmatch.colorDelta.

@mourner
Member

mourner commented Oct 20, 2015

Also, RGB values are 0-255.

@hastebrot
Author

colorDelta() works correctly with grayscale pixels and with colored pixels (the difference between red with an alpha of 0 and red with an alpha of 255 is 0.299).

I'm not quite sure about the comparison with threshold or maxDelta, and I need to test the examples with pixelmatch(). Also, I'm a bit puzzled why you use 255 * 255 * 3 in

// maximum acceptable square YUV distance between two colors
var maxDelta = 255 * 255 * 3 * threshold * threshold

I've ported the code to Java/Kotlin, changed the RGB values to be within 0..1, used the more computationally intensive sqrt(delta) > threshold instead of delta > threshold * threshold, and changed the code to use 0.9 as the blend factor for white instead of 0.1 (see code below).

// pixels are similar; draw background as grayscale image blended with white
var val = 255 - 0.1 * (255 - grayPixel(img1, pos)) * img1[pos + 3] / 255;

My version (basically a lerp() between luma and 1.0):

// factor = 0.0 leaves the luma unchanged. factor = 1.0 changes luma to total white.
fun blendToWhite(luma: Double,
                 factor: Double): Double {
    //return 1.0 - ((1.0 - luma) * factor) // similar to the javascript implementation
    return ((1.0 - factor) * luma) + (factor * 1.0)
}

Before that, the diff images were equal to the images from test/fixtures. After the code changes, the error pixels in the diff images were slightly different, even when changing the blend factor code back to the previous version.

@hastebrot
Author

Another example (https://jsbin.com/nulode/edit?js,console):

var white = [255, 255, 255, 255];
var gray50 = [0.5 * 255, 0.5 * 255, 0.5 * 255, 255];
var gray60 = [0.6 * 255, 0.6 * 255, 0.6 * 255, 255];

console.log(Math.abs(colorDelta(gray50, gray60, 0, 0, true)) / 255); // 0.1
console.log(Math.sqrt(colorDelta(gray50, gray60, 0, 0, false)) / 255); // 0.1

console.log(pixelmatch(gray50, gray60, white, 1, 1, {threshold: 0.1})); // 0
console.log(pixelmatch(gray50, gray60, white, 1, 1, {threshold: 0.057})); // 1
console.log(pixelmatch(gray50, gray60, white, 1, 1, {threshold: 0.058})); // 0

Here the color delta is exactly 0.1, but the threshold is somewhere between 0.057 and 0.058.

@hastebrot
Author

But maybe I missed something. It also doesn't work with {threshold: Math.sqrt(0.1)}.

@mourner
Member

mourner commented Oct 20, 2015

You need to understand what a color distance is. Distance is sqrt(a^2 + b^2 + c^2), not (a + b + c)/3. That's why the maximum squared distance is (255-0)^2 + (255-0)^2 + (255-0)^2 = 255 * 255 * 3, and that's why the threshold ends up around 0.057… rather than 0.1.
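
Spelling that out for the 0.5 vs. 0.6 gray example above (just the arithmetic implied by the quoted maxDelta formula, as a quick sanity check):

// the two grays differ only in luma, so the squared YUV distance is about (0.1 * 255)^2
var delta = Math.pow(0.1 * 255, 2); // 650.25

// a pixel is flagged as different while delta > 255 * 255 * 3 * threshold * threshold,
// so the crossover threshold is:
console.log(Math.sqrt(delta / (255 * 255 * 3))); // ~0.0577, between the observed 0.057 and 0.058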

Closing as this is not an issue and I don't think the colorDelta function needs any tests since it's very simple and I don't expect any regressions there.

@mourner mourner closed this as completed Oct 20, 2015
@hastebrot
Author

Ahh, thanks, I see.

The gray example colors caused the color distance to be 0.1 because gray colors only use the Y luma channel in Y'UV, which means the calculated distance was sqrt(a^2 + 0 + 0).

There is a small thing I wanted to point out as a side note: for RGB the maximum squared distance is 255 * 255 * 3, but for Y'UV I'm not sure. The ranges for Y'UV (using BT.601 constants) are:

  • Y' from 0.0 to 1.0 (e.g. black and white have these extreme values in their Y' channel)
  • U from -0.436 to 0.436 (e.g. blue and yellow have these extreme values in their U channel)
  • V from -0.615 to 0.615 (e.g. red and cyan have these extreme values in their V channel)

If you use RGB values in the 0-255 range (instead of 0-1), you have to multiply these Y'UV ranges by 255. You can see that the V difference between red (0.615 * 255) and cyan (-0.615 * 255) is bigger than 255 - 0. So there are pairs of Y'UV colors whose squared distance is bigger than the assumed maximum (255 * 255 * 3), but since we calculate the Y'UV colors from RGB colors we never actually get such distances.

So the real max distance for Y'UV colors calculated from RGB colors might be a bit smaller than the max distance we use. But even if we had distances bigger than that max distance, I think it wouldn't matter much in practice for the test cases.
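
For reference, a minimal sketch of a BT.601-style conversion these ranges come from (textbook coefficients, not necessarily the exact constants pixelmatch uses internally):

// standard BT.601 Y'UV conversion for r, g, b in 0..255
function rgb2yuv(r, g, b) {
    var y =  0.299 * r + 0.587 * g + 0.114 * b;      // 0 to 255
    var u = -0.14713 * r - 0.28886 * g + 0.436 * b;  // -0.436 * 255 to 0.436 * 255
    var v =  0.615 * r - 0.51499 * g - 0.10001 * b;  // -0.615 * 255 to 0.615 * 255
    return [y, u, v];
}

console.log(rgb2yuv(0, 0, 255)); // blue hits the U maximum
console.log(rgb2yuv(255, 0, 0)); // red hits the V maximum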

@mourner
Member

mourner commented Oct 22, 2015

Hmm, yeah, you're probably right. We need to figure out the max and adjust.

@hastebrot
Author

The naive approach would be to go through all 256 * 256 * 256 RGB colors, convert them to Y'UV, and store the min and max values of Y', U and V. I wonder what the literature says about the maximum color distance for the L*a*b* color space, whose gamut is also not a cube.
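
Scanning every 24-bit RGB color for the per-channel extrema is actually cheap (about 16.7 million conversions); only the search over all pairs of colors is expensive. A rough sketch of the channel scan, reusing the hypothetical rgb2yuv() from the sketch above:

// record the extreme Y', U and V values over all RGB colors
var minYUV = [Infinity, Infinity, Infinity];
var maxYUV = [-Infinity, -Infinity, -Infinity];

for (var r = 0; r < 256; r++) {
    for (var g = 0; g < 256; g++) {
        for (var b = 0; b < 256; b++) {
            var yuv = rgb2yuv(r, g, b);
            for (var i = 0; i < 3; i++) {
                if (yuv[i] < minYUV[i]) minYUV[i] = yuv[i];
                if (yuv[i] > maxYUV[i]) maxYUV[i] = yuv[i];
            }
        }
    }
}

console.log(minYUV, maxYUV);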

@mourner
Member

mourner commented Oct 22, 2015

Calculated — the max seems to be just a little bit bigger:

Math.sqrt(Math.pow(255 * 0.436 * 2, 2) + Math.pow(255 * 0.615 * 2, 2) + Math.pow(255, 2))
461.3515927142768
Math.sqrt(255 * 255 * 3)
441.6729559300637

@hastebrot
Author

Yeah. So this is the maximum distance for Y'UV colors. Seems to be a better value for max distance.

As noted, Math.sqrt(colorDelta(...)) will never return 461.35..., because there are simply no RGB values that give us the Y'UV colors (0.0, -0.436, -0.615) and (1.0, 0.436, 0.615).

We could go even further with calculations for the max distance, but I think 461.35... is quite good.

@mourner
Member

mourner commented Oct 22, 2015

Interesting, I'd be up for looping through all RGB values and determining the max distance, just out of curiosity.

@hastebrot
Author

I'd be interested in which pair of RGB triples gives us the max distance. It could be computationally intensive to loop through all pairs of RGB values, though.

@mourner
Member

mourner commented Oct 22, 2015

@hastebrot no, we'd just calculate the max once and then insert that as a constant.

@hastebrot
Author

I get the following results (https://github.com/hastebrot/notebooks/blob/master/ipython/yuv-color-range.ipynb):

Max distance between two Y'UV colors (which were converted from RGB colors) is between red and cyan:

min_yuv = (0, -U_MAX * 255, -V_MAX * 255)
max_yuv = (255, U_MAX * 255, V_MAX * 255)
red_yuv = to_yuv(255, 0, 0)
cyan_yuv = to_yuv(0, 255, 255)

print("dist(min, max):", yuv_dist(min_yuv, max_yuv) ** 0.5)
print("dist(red, cyan):", yuv_dist(red_yuv, cyan_yuv) ** 0.5)

# dist(min, max): 461.3515927142768
# dist(red, cyan): 338.2928543947083

I've used 5832 evenly distributed RGB triples, with a step of 15 per channel (a JavaScript sketch of this search follows the results below).

print("num of colors:", len(colors))
print("color steps:", range(0, 255 + 1, color_step))

# num of colors: 5832
# color steps: range(0, 256, 15)

Max color distances sorted in descending order (we see a clear trend here, so I think my calculations are correct):

338.2928543947083 (0, 255, 255) (255, 0, 0)
338.2928543947083 (255, 0, 0) (0, 255, 255)
334.989353150992 (0, 255, 240) (255, 0, 0)
334.989353150992 (0, 255, 255) (255, 0, 15)
334.989353150992 (255, 0, 0) (0, 255, 240)
334.989353150992 (255, 0, 15) (0, 255, 255)
331.7974276678819 (0, 255, 240) (255, 0, 15)
331.7974276678819 (255, 0, 15) (0, 255, 240)
331.79742766788183 (0, 255, 225) (255, 0, 0)
331.79742766788183 (0, 255, 255) (255, 0, 30)
331.79742766788183 (255, 0, 0) (0, 255, 225)
331.79742766788183 (255, 0, 30) (0, 255, 255)
330.6929739870048 (15, 255, 255) (255, 0, 0)
330.6929739870048 (255, 0, 0) (15, 255, 255)
330.6929739870048 (0, 255, 255) (240, 0, 0)
330.6929739870048 (240, 0, 0) (0, 255, 255)
...
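
For completeness, a JavaScript version of the same sampled pairwise search (step of 15 per channel, i.e. 18^3 = 5832 grid colors, again reusing the hypothetical rgb2yuv() sketch from above rather than anything exported by pixelmatch):

// sampled brute-force search for the pair of RGB colors with the largest YUV distance
var colors = [];
for (var r = 0; r <= 255; r += 15)
    for (var g = 0; g <= 255; g += 15)
        for (var b = 0; b <= 255; b += 15)
            colors.push({rgb: [r, g, b], yuv: rgb2yuv(r, g, b)});

var best = {dist: 0};
for (var i = 0; i < colors.length; i++) {
    for (var j = i + 1; j < colors.length; j++) {
        var dy = colors[i].yuv[0] - colors[j].yuv[0];
        var du = colors[i].yuv[1] - colors[j].yuv[1];
        var dv = colors[i].yuv[2] - colors[j].yuv[2];
        var dist = Math.sqrt(dy * dy + du * du + dv * dv);
        if (dist > best.dist) best = {dist: dist, a: colors[i].rgb, b: colors[j].rgb};
    }
}

console.log(best); // with these coefficients this reports red vs. cyan at roughly 338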

@hastebrot
Author

Hmm, some entries that should have exactly the same distance (they share the same RGB difference vector) differ slightly in the last digits. Probably a rounding error. 😕

@mourner
Member

mourner commented Oct 22, 2015

Cool, thanks! Actually I'm starting to think that YUV distance is not the best metric of human-perceived color difference. CIEDE2000 is considered the best metric (https://en.wikipedia.org/wiki/Color_difference), but it's very computationally expensive: https://github.com/markusn/color-diff. I still want to try it out.

@hastebrot
Author

There is also CAM02-UCS, which is regarded as an excellent model for color distance.


@mourner
Member

mourner commented Oct 22, 2015

Found this excellent paper that describes a color difference algorithm that's close to CIEDE2000 but very fast to compute: http://www.progmat.uaem.mx:8080/artVol2Num2/Articulo3Vol2Num2.pdf

@hastebrot
Author

From the abstract of Kotsarenko et al.:

The experimental results show that the newly introduced formulas are close in perceived terms to CIELAB and CIELUV but are significantly faster, making them good candidates for measuring color difference on mobile devices and applications even in real-time.

This sounds great.

ReDrUm pushed a commit to ReDrUm/pixelmatch that referenced this issue Sep 20, 2019
…dd-examples-and-gifs-to-readme to master

* commit '0e63081c2a8dd51b42f8bc8edf7e94fc5916719b':
  docs(README): adding gifs to illustrate example usage
ReDrUm pushed a commit to ReDrUm/pixelmatch that referenced this issue Sep 20, 2019
feat(matcher): add noColors options