Error in distance calculators #4

Open
Nommyde opened this issue Mar 30, 2016 · 22 comments
Comments

@Nommyde

Nommyde commented Mar 30, 2016

In the ManhattanSRGB and EuclideanRgb… methods, r, g, b are multiplied by the coefficients
RED = .2126, GREEN = .7152, BLUE = .0722

This makes no sense, because these coefficients apply to linear CIE RGB, not to sRGB (which is what all images on the web are in). It would make sense if these coefficients were converted to sRGB, in which case they become:
RED = 0.4984, GREEN = 0.8625, BLUE = 0.2979

Alternatively, do not multiply the components by coefficients at all, but simply convert the pixel from sRGB to CIE RGB and measure the Euclidean distance there (though in that case the perceptual meaning, in terms of human vision, is lost).

These coefficients are also used in the getLuminocity method, which is likewise incorrect. A better formula there would be
Luminance (perceived option 1): (0.299*R + 0.587*G + 0.114*B) [2]

More accurate still would be to take the Y coordinate from CIE XYZ directly, but that is slow to compute.
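The proposed 0.4984 / 0.8625 / 0.2979 values can be reproduced by pushing the Rec.709 linear-light coefficients through the standard sRGB transfer function. A minimal sketch (not code from this project):

```javascript
// Push the Rec.709 linear-light luminance weights through the sRGB
// encoding function; this yields the sRGB-domain coefficients that the
// post proposes (roughly 0.4984, 0.8625, 0.2979).
function linearToSrgb(c) {
  return c <= 0.0031308
    ? 12.92 * c
    : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}

const rec709 = [0.2126, 0.7152, 0.0722]; // linear-light Y weights
const srgbWeights = rec709.map(linearToSrgb);

console.log(srgbWeights); // approximately [0.4984, 0.8625, 0.2979]
```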

@leeoniya

Igor probably got those numbers from the original RgbQuant source [1]; I took the coefficients from Rec. 709 [2] for RGB.

@Nommyde, where are you getting your coefficients? I could not find them anywhere. Do they look more accurate in your testing? Also, if sRGB is non-linear, then how would different (but still static) coefficients be any more accurate?

[1] https://github.com/leeoniya/RgbQuant.js/blob/master/src/rgbquant.js#L703
[2] https://en.wikipedia.org/wiki/Rec._709

@Nommyde

Nommyde commented Mar 30, 2016

@leeoniya I got these coefficients by applying the sRGB conversion formula (see C_srgb).

Do they look more accurate in your testing?

Yes. And if you scale the RGB levels with these coefficients and look at the same levels (black rectangle):
[image: srgb]
the colors will have equivalent lightness. But if you scale with the Rec.709 coefficients, the lightness will differ.

if sRGB is non-linear, then how would different (but still static) coefficients be any more accurate?

sRGB is non-linear in energy, but approximately linear for human perception, so it can be scaled linearly.

@Nommyde

Nommyde commented Mar 30, 2016

@leeoniya
Rec.709 coeffs applied:
[image: srgb2]

Red and blue are lighter than green at the same points.

If a coefficient is too small (like the linear 0.0722), a small difference in the corresponding coordinate will make a big color change, but the same difference in another coordinate will make only a small color change.

@leeoniya

@Nommyde

Interesting, thanks!

I'll play around with the adjusted coefficients in RgbQuant as well and see how things look. Though RgbQuant uses a component-scaled Euclidean distance [1], which got better results than a simple scaled sum.

[1] http://alienryderflex.com/hsp.html
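For readers unfamiliar with the HSP idea, a component-scaled Euclidean distance looks roughly like the sketch below; the weights and names are illustrative, not RgbQuant's actual internals:

```javascript
// Illustrative component-scaled Euclidean distance: each squared channel
// difference is weighted before summing (here with Rec.709 weights).
// This is a sketch of the idea, not code copied from RgbQuant.
const W = { r: 0.2126, g: 0.7152, b: 0.0722 };

function weightedEuclideanDistance(a, b) {
  const dr = a.r - b.r;
  const dg = a.g - b.g;
  const db = a.b - b.b;
  return Math.sqrt(W.r * dr * dr + W.g * dg * dg + W.b * db * db);
}
```

With these weights, an equal numeric change in green moves the distance far more than the same change in blue.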

@Nommyde

Nommyde commented Mar 30, 2016

@leeoniya in your project, luma and distance are calculated with the squares of the components (which looks like a fast approximation of gamma correction), so the Rec.709 coefficients are more suitable there. But this project has no gamma correction.

@Nommyde

Nommyde commented Mar 30, 2016

@igor-bezkrovny by the way, instead of changing the coefficients, you could also try applying fast gamma correction by squaring the components, like @leeoniya does.
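The squaring trick can be sketched as follows; approxLuma is a hypothetical name, and the square is used as a cheap stand-in for the real sRGB decoding curve (roughly gamma 2.2/2.4):

```javascript
// Fast gamma approximation: square the normalized channel values before
// applying the linear-light Rec.709 weights. Squaring stands in for the
// real sRGB decoding curve, which is close to a power of 2.2-2.4.
function approxLuma(r, g, b) {
  const rl = Math.pow(r / 255, 2);
  const gl = Math.pow(g / 255, 2);
  const bl = Math.pow(b / 255, 2);
  return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl; // approx. linear Y in 0..1
}
```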

@leeoniya

Oh, I see.

The Manhattan option [1] doesn't use squares though, so maybe it's worth using the adjusted coefficients for that.

[1] https://github.com/leeoniya/RgbQuant.js/blob/master/src/rgbquant.js#L733

@Nommyde

Nommyde commented Mar 30, 2016

Maybe, try them :)

@leeoniya

The quality that image-quantization gets with Wu v2 w/alpha + Riemersma is just astounding. IMO nothing beats it in any of the test cases I have tried. And it gets these results with no tweaking needed. The perf could use a bit of work...perhaps with asm.js or WebAssembly or even better by offloading to WebGL shaders & Web Workers.

BTW, until I poke my server, working demos are here: http://leeoniya.github.io/RgbQuant.js/demo/

@Nommyde

Nommyde commented Mar 30, 2016

nothing beats it

@leeoniya, among the methods in this project, or among all the ones you know of?

@leeoniya

Among all that I've seen before giving up :) Including various combinations of [1] [2].

Maybe there is something better, but I would have a hard time determining if it was actually better or just "different".

[1] http://www.codeproject.com/Articles/66341/A-Simple-Yet-Quite-Powerful-Palette-Quantizer-in-C
[2] http://bisqwit.iki.fi/story/howto/dither/jy/

@Nommyde

Nommyde commented Mar 30, 2016

http://igor-bezkrovny.github.io/image-q/demo/0.1.4/index.html - can I select that best combination here? Or is this demo outdated?

@leeoniya

[image: quant_combo]

@Nommyde

Nommyde commented Mar 30, 2016

I think that with a small number of colors, scolorq works best of all.
Original:

[image]

Wu: 4 colors

[image]

scolorq: 4 colors

[image]

http://bisqwit.iki.fi/jutut/colorquant/index4.html
http://www.cs.berkeley.edu/~dcoetzee/downloads/scolorq/
http://www.ximagic.com/q_results_kodim04_16.html

@leeoniya

In this case, yes.

When the color counts are very low, the variation is huge not only between libraries but also between individual samples. At this point you're just choosing what you personally prefer for your specific workload, since every answer is very wrong, just along different axes, lol.

I think for >= 16 colors Wu + Riemersma is very hard to beat over a diverse workload. Of course I could be wrong :)

@Nommyde

Nommyde commented Mar 30, 2016

Even at 16 colors, scolorq seems more accurate to me :) But I haven't tested its performance; I suspect it runs very slowly.

wu - original - scolorq
[image]

@leeoniya

No argument from me :)

@leeoniya

However, the demo also doesn't have a Wu version with component coefficients, which often have a large impact and may fix that shift towards red in the Wu version.

@ibezkrovnyi

ibezkrovnyi commented Jul 21, 2016

Regarding original post:

There is an opinion that sRGB luminosity should be calculated as follows:

  1. in Russian
  2. more verbose, in English

Name      Red       Green     Blue
NTSC RGB  0.298839  0.586811  0.114350
CIE RGB   0.176204  0.812985  0.010811
sRGB      0.212656  0.715158  0.072186

The coefficients in Luminance (perceived option 1): (0.299*R + 0.587*G + 0.114*B) [2] come from the 1953 standard for NTSC/phosphor displays - see this.

So it seems the best coefficients for the Y of sRGB images are RED = .2126, GREEN = .7152, BLUE = .0722.

@Nommyde @leeoniya

@Nommyde

Nommyde commented Jul 22, 2016

@igor-bezkrovny anyway, calculating luma as a linear combination of non-linear sRGB components (ignoring gamma) is a bad idea if you need high precision. It is just an approximation.
And I was wrong when I said that the bt.601 coefficients are better, but the bt.709 coefficients are not the best either.
For example, if you compare Y calculated with the bt.709 coefficients and Y calculated with the bt.601 coefficients against the more accurate L* component, you will see approximately the same errors. But bt.601 works better with dark colors, and bt.709 with light colors.
Here is the code: https://jsfiddle.net/Nommyde/3Lxe10pp/
So there are no ideal coefficients for calculating Y this way.

But in the Manhattan algorithm I think my coefficients will be better :) Try them on real pictures.
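The comparison described above can be sketched like this (the fiddle's exact contents are assumed, not copied): compute the true L* via linearized sRGB, and compare it against an L* derived from a gamma-ignoring luma:

```javascript
// True L*: linearize sRGB, take Rec.709-weighted Y, convert to CIE L*.
// Naive L*: treat the gamma-encoded luma itself as if it were linear Y.
// The gap between the two illustrates the error discussed in the thread.
function srgbToLinear(c) {
  c /= 255;
  return c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function yToLstar(y) {
  // CIE lightness from relative luminance (Y in 0..1)
  return y > 216 / 24389 ? 116 * Math.cbrt(y) - 16 : (24389 / 27) * y;
}

function trueLstar(r, g, b) {
  const y =
    0.2126 * srgbToLinear(r) +
    0.7152 * srgbToLinear(g) +
    0.0722 * srgbToLinear(b);
  return yToLstar(y);
}

function naiveLstar(r, g, b, w) {
  return yToLstar((w[0] * r + w[1] * g + w[2] * b) / 255);
}

const bt601 = [0.299, 0.587, 0.114];
const bt709 = [0.2126, 0.7152, 0.0722];
```

For a mid grey like (128, 128, 128), both naive variants overshoot the true L* considerably, since they skip the gamma decoding step.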

@ibezkrovnyi

ibezkrovnyi commented Jul 25, 2016

For example, if you compare Y calculated with the bt.709 coefficients and Y calculated with the bt.601 coefficients against the more accurate L* component, you will see approximately the same errors. But bt.601 works better with dark colors, and bt.709 with light colors.
Here is the code: https://jsfiddle.net/Nommyde/3Lxe10pp/
So there are no ideal coefficients for calculating Y this way.

Yes, but bt.709 is the best standard now, and it is the only standard for sRGB.

But in Manhattan algorithm i think my coeffs will be better:) try them on real pictures

I temporarily added ManhattanNommyde color distance method here: http://igor-bezkrovny.github.io/image-q/demo/1.0.1/index.html.
It uses coefficients 0.4984 * R + 0.8625 * G + 0.2979 * B. You can play with different images comparing it with ManhattanSRGB and other color distance methods.

However:

  • the sum of the coefficients is not 1.0
  • some images look worse; for others, nothing noticeably changed
  • when I add these coefficients to https://jsfiddle.net/Nommyde/3Lxe10pp/, the errors are ~4 times worse than bt.601/bt.709.

@Nommyde

Nommyde commented Jul 26, 2016

@igor-bezkrovny in the Manhattan algorithm the sum of the coefficients does not have to be 1.0, and that criterion is not applicable in that script (https://jsfiddle.net/Nommyde/3Lxe10pp/), because in simple Manhattan the sum of the coefficients is 3.0 :) and that is not a problem.
But if you want, you can scale them (divide by 1.6588); the result will not change.

simple manhattan:

[image]

nommyde manhattan:

[image]

sRGB manhattan (looks wrong):

[image]
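The scaling claim is easy to verify: dividing every Manhattan weight by the same positive constant rescales all distances uniformly, so the nearest palette entry never changes. A small sketch with an illustrative palette (not this project's code):

```javascript
// Nearest-palette lookup with weighted Manhattan distance. Normalizing
// the weights (dividing by their sum, ~1.6588) scales every distance by
// the same factor, so the chosen palette index is unchanged.
const wRaw = [0.4984, 0.8625, 0.2979];
const sum = wRaw.reduce((a, b) => a + b, 0); // ~1.6588
const wNorm = wRaw.map((w) => w / sum);

function manhattan(w, a, b) {
  return (
    w[0] * Math.abs(a[0] - b[0]) +
    w[1] * Math.abs(a[1] - b[1]) +
    w[2] * Math.abs(a[2] - b[2])
  );
}

function nearest(w, color, palette) {
  let best = 0;
  for (let i = 1; i < palette.length; i++) {
    if (manhattan(w, color, palette[i]) < manhattan(w, color, palette[best])) {
      best = i;
    }
  }
  return best;
}

const palette = [[0, 0, 0], [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 255]];
```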
