
training a net for Euclidean color distance #1

Closed
leeoniya opened this issue Jan 7, 2020 · 10 comments
leeoniya commented Jan 7, 2020

Hi @photopea,

i'm on the lookout for js micro-libs and there are not many in the NN category. brain.js is huge, as is convnetjs and others. UNN looks interesting. i'm new to neural nets but would like to try training a NN to approximate color distance for my color quantization library: https://github.com/leeoniya/RgbQuant.js

so far i've tried a basic feed-forward neural net using https://github.com/wpmed92/backpropaganda but the results were not too good (they seemed to mostly match a Gaussian curve). i'm not sure if my generated training set is too small or my network topology is wrong. i don't have much intuition about what the correct parameters and training-set size should be. the general advice i've heard is that the hidden layer size should be half-way between the input and output layer sizes. previous discussion here: wpmed92/backpropaganda#13
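For context, the kind of net described above could be sketched without any library at all. This is a hedged illustration (all sizes and the learning rate are hypothetical, not taken from backpropaganda): a 6→4→1 feed-forward net with sigmoid hidden units, trained by plain SGD/backprop to approximate the normalized Euclidean distance between two RGB colors.

```javascript
// Hypothetical sizes: 6 inputs (two RGB colors), 4 hidden units
// (roughly "half-way" per the rule of thumb above), 1 output.
const IN = 6, HID = 4;
const rand = () => Math.random() - 0.5;
const sig = x => 1 / (1 + Math.exp(-x));

let w1 = Array.from({ length: HID }, () => Array.from({ length: IN }, rand));
let b1 = new Array(HID).fill(0);
let w2 = Array.from({ length: HID }, rand);
let b2 = 0;

// returns [hidden activations, output] -- sigmoid hidden, linear output
function forward(x) {
  const h = w1.map((row, j) => sig(row.reduce((s, w, i) => s + w * x[i], b1[j])));
  return [h, w2.reduce((s, w, j) => s + w * h[j], b2)];
}

// one SGD step on squared error (y - t)^2
function trainStep(x, t, lr) {
  const [h, y] = forward(x);
  const d = 2 * (y - t);                        // dLoss/dy
  for (let j = 0; j < HID; j++) {
    const dh = d * w2[j] * h[j] * (1 - h[j]);   // backprop through sigmoid
    for (let i = 0; i < IN; i++) w1[j][i] -= lr * dh * x[i];
    b1[j] -= lr * dh;
    w2[j] -= lr * d * h[j];
  }
  b2 -= lr * d;
}

// training set: two random colors in [0,1]^3, normalized distance as target
const data = Array.from({ length: 200 }, () => {
  const x = Array.from({ length: IN }, Math.random);
  const t = Math.hypot(x[0] - x[3], x[1] - x[4], x[2] - x[5]) / Math.sqrt(3);
  return [x, t];
});

const meanLoss = () =>
  data.reduce((s, [x, t]) => s + (forward(x)[1] - t) ** 2, 0) / data.length;

const initialLoss = meanLoss();
for (let epoch = 0; epoch < 300; epoch++)
  for (const [x, t] of data) trainStep(x, t, 0.1);
const finalLoss = meanLoss();
```

With a few hundred samples and epochs the loss drops well below its initial value; whether the learned function is accurate enough to replace a real deltaE is exactly the open question of this issue.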

maybe you would be interested in sharing some of your wisdom or advice?

thanks!
Leon

photopea commented Jan 7, 2020

Hi Leon, I have never heard of doing quantization with neural networks, and I am not sure it is the right approach.

But I made a color quantization library in the past, which is very fast and gives very good results. Look here: https://github.com/photopea/UPNG.js

There is a method UPNG.quantize(data, psize), which reads colors from "data" and returns a palette with "psize" colors. Even if there are only 100 unique colors in your image and you want to reduce it to 10, you should provide the whole image as "data", since the quality of quantization depends not only on the unique colors, but also on their frequency.

You can also try it yourself here: http://upng.photopea.com/ , it accepts PNG and JPG images.


photopea commented Jan 8, 2020

I just made a demo comparing your tool and my tool with 256 colors (first is UPNG.js) :)
[two demo images: kitten_pp (UPNG.js) and kitten_RgbQuant (RgbQuant.js)]


leeoniya commented Jan 8, 2020

haha, nice. we're pretty close! i'll def check it out in the next few days.

btw, NeuQuant uses an NN to do its thing, but the results are not great
https://scientificgems.wordpress.com/stuff/neuquant-fast-high-quality-image-quantization/

RgbQuant does not do well with low palette counts and often needs some help with parameter tweaking. component-weighted euclidean distance is not great, and i wanted to train a neural net to avoid doing all the slow math required for deltaE in the ciecam02-ucs color space (Jab): https://gramaz.io/d3-cam02/

i've been impressed with Wu v2, but there's no JS (or wasm) port: https://gist.github.com/bert/1192520

also an offshoot of RgbQuant with a lot of other algos is here: https://github.com/ibezkrovnyi/image-quantization


photopea commented Jan 8, 2020

I guess it depends on how you define the quality of quantization: whether you just want the smallest difference between the old and the new color of each pixel, or you also want to reduce noise, etc.

I think my quantization is similar to Wu v2. I used k-d trees to split the 4D color space based on variance. The tree is then used to find nearest neighbours in logarithmic time per pixel.
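A simplified illustration of the variance-based splitting described above (not UPNG.js's actual implementation, and without the tree structure that gives the logarithmic lookups): repeatedly split the box/channel pair with the greatest variance at its mean, until the desired palette size is reached; each box's mean color becomes one palette entry.

```javascript
// mean and variance of one RGBA channel over a list of pixels
function meanAndVar(px, ch) {
  let m = 0;
  for (const p of px) m += p[ch];
  m /= px.length;
  let v = 0;
  for (const p of px) v += (p[ch] - m) ** 2;
  return [m, v / px.length];
}

// px: array of [r, g, b, a] pixels; k: desired palette size
function variancePalette(px, k) {
  let boxes = [px];
  while (boxes.length < k) {
    // pick the box/channel pair with the largest variance
    let best = null;
    for (const box of boxes) {
      if (box.length < 2) continue;
      for (let ch = 0; ch < 4; ch++) {
        const [m, v] = meanAndVar(box, ch);
        if (!best || v > best.v) best = { box, ch, m, v };
      }
    }
    if (!best) break;
    // split the chosen box at the channel mean
    const lo = best.box.filter(p => p[best.ch] <= best.m);
    const hi = best.box.filter(p => p[best.ch] > best.m);
    if (!lo.length || !hi.length) break; // degenerate split, stop
    boxes = boxes.filter(b => b !== best.box).concat([lo, hi]);
  }
  // the mean color of each box becomes one palette entry
  return boxes.map(box =>
    [0, 1, 2, 3].map(ch => Math.round(meanAndVar(box, ch)[0])));
}
```

In a real implementation the splits would be stored as a tree, so each pixel can descend it in O(log k) comparisons to find its palette entry.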

@photopea photopea closed this as completed Jan 8, 2020
@photopea photopea reopened this Jan 8, 2020

leeoniya commented Jan 8, 2020

ok, wow. UPNG's quantizer is both excellent, and extremely fast. i'm going to play more with it.

from the faq...

To get one common palette for multiple images (e.g. frames of the animation), concatenate them into one array data.

is there a way to do this iteratively rather than all at once? a lot of frames concat'd together will get big very quickly.
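For reference, the all-at-once concatenation the FAQ describes is simple to build (a sketch, assuming each frame is an RGBA Uint8Array of identical dimensions), but the buffer does grow linearly with the frame count:

```javascript
// concatenate per-frame RGBA buffers into one `data` array
function concatFrames(frames) {
  const total = frames.reduce((s, f) => s + f.length, 0);
  const data = new Uint8Array(total);
  let off = 0;
  for (const f of frames) {
    data.set(f, off); // copy frame into the shared buffer
    off += f.length;
  }
  return data;
}
```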

from photopea/UPNG.js#21:

I implemented Floyd-Steinberg dithering in the past, but it was not worth it.

i see that the discussion mostly revolved around how well the result can be compressed, since that's UPNG's primary purpose, but i'd definitely be interested in the option of toggling an error-diffusion, ordered, pattern or Riemersma dithering kernel. it may not compress well, but i've seen plenty of instances where a Floyd–Steinberg dither is much more desirable than finer-but-still-visible banding. any chance of this happening?


photopea commented Jan 8, 2020

My current implementation goes through all pixels in one go; I don't think there is an easy way to re-implement it as multiple calls. Even though Chrome limits a tab to about 4 GB of RAM, that should be enough for hundreds of 1000x1000-pixel frames (each RGBA frame is only 4 MB).

You are right, my goal is to keep files smaller, and in my experiments a dithered image with 20 colors was bigger than a non-dithered image with 40 colors. I think it would require coming up with a compression-friendly dithering.

But you can use UPNG.js only to get a palette, and generate dithered images yourself.
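Generating the dithered frames yourself could look like the following sketch: classic Floyd–Steinberg error diffusion against a fixed palette. The palette is just an array of [r, g, b] entries from any source (e.g. one derived from UPNG.quantize), and the nearest-color lookup here is a plain linear scan with squared distance.

```javascript
// index of the palette color nearest (by squared distance) to r,g,b
function nearest(palette, r, g, b) {
  let bi = 0, bd = Infinity;
  for (let i = 0; i < palette.length; i++) {
    const [pr, pg, pb] = palette[i];
    const d = (r - pr) ** 2 + (g - pg) ** 2 + (b - pb) ** 2;
    if (d < bd) { bd = d; bi = i; }
  }
  return bi;
}

// rgb: flat [r,g,b, r,g,b, ...] image; returns palette indices per pixel
function floydSteinberg(rgb, w, h, palette) {
  const buf = Float32Array.from(rgb); // working copy accumulates error
  const out = new Uint8Array(w * h);
  const spread = (x, y, er, eg, eb, f) => {
    if (x < 0 || x >= w || y >= h) return; // error falls off the edge
    const j = (y * w + x) * 3;
    buf[j] += er * f; buf[j + 1] += eg * f; buf[j + 2] += eb * f;
  };
  for (let y = 0; y < h; y++)
    for (let x = 0; x < w; x++) {
      const j = (y * w + x) * 3;
      const idx = nearest(palette, buf[j], buf[j + 1], buf[j + 2]);
      out[y * w + x] = idx;
      const [pr, pg, pb] = palette[idx];
      const er = buf[j] - pr, eg = buf[j + 1] - pg, eb = buf[j + 2] - pb;
      // standard Floyd–Steinberg weights: 7/16, 3/16, 5/16, 1/16
      spread(x + 1, y,     er, eg, eb, 7 / 16);
      spread(x - 1, y + 1, er, eg, eb, 3 / 16);
      spread(x,     y + 1, er, eg, eb, 5 / 16);
      spread(x + 1, y + 1, er, eg, eb, 1 / 16);
    }
  return out;
}
```

Dithering a flat mid-gray image against a black/white palette, for example, yields a checkerboard-like mix of both entries rather than a solid block.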

@photopea photopea closed this as completed Jan 8, 2020

leeoniya commented Jan 8, 2020

But you can use UPNG.js only to get a palette, and generate dithered images yourself.

would this not require a good color distance function? (hence this issue). i guess you feel this is good enough?

https://github.com/photopea/UPNG.js/blob/master/UPNG.js#L927


photopea commented Jan 8, 2020

As I said before, it depends on how you define the quality of quantization. How would you like to define a difference between pixels? I think the mean square error between RGBA channels is quite good.


leeoniya commented Jan 8, 2020

As I said before, it depends on how you define the quality of quantization.

i'm not sure this applies here. quantization quality is subjective, since it can be optimized for different things, but i feel like color distance is much more objective: a good algo will identify perceptually close colors properly (in terms of human vision), or it won't. raw MSE is good (not great), but probably a reasonable perf trade-off. it might be worth at least weighting the components using Rec. 709 coefficients. maybe the palettes produced by UPNG are sufficiently diverse for this method not to make any obvious mistakes. i'll do some poking.
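For concreteness, the Rec. 709 weighting mentioned above would look something like this (a sketch, not code from either library): the standard luma coefficients scale each channel's squared difference, so a green error of a given size counts roughly ten times as much as an equally sized blue error, which roughly matches the eye's sensitivity.

```javascript
// Rec. 709 luma coefficients for R, G, B
const W709 = [0.2126, 0.7152, 0.0722];

// plain squared Euclidean distance between two [r, g, b] colors
function distSq(a, b) {
  return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2 + (a[2] - b[2]) ** 2;
}

// the same distance with each channel weighted by its luma contribution
function distSq709(a, b) {
  return W709[0] * (a[0] - b[0]) ** 2 +
         W709[1] * (a[1] - b[1]) ** 2 +
         W709[2] * (a[2] - b[2]) ** 2;
}
```

Both are monotonic in per-channel error, so swapping one for the other only changes which palette entry wins when candidates are close.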

thanks for the discussion and sorry for the issue going somewhat off-topic.

Photopea is quite the masterpiece. i didn't realize UNN was written by the same author, and the fact that i wanted to use it for quantization was pure coincidence!

cheers!


photopea commented Jan 8, 2020

Thanks! I don't think other ways of comparing colors would produce better results. But it depends on how you define "better".
