White-Box model results are not good #2
Comments
Yes, I also think the white-box model is not as good as the official version.
When I was re-implementing it, I spent a lot of time ensuring consistency with the official implementation. Now I am trying to train both steps with the same hyperparameters as the official version. Training this model costs a lot of time in the superpixel step, so I need some time to test. If you can help find which part of the code has a problem, I will be very grateful.
Thanks for your quick response! I have not started training yet; I will let you know if I figure something out. By the way, which superpixel method did you use in your training? I wonder how much impact the superpixel method has on the results.
My default superpixel method during training is consistent with the one mentioned in the author's paper: he uses an adaptive-brightness superpixel method to increase the brightness of the output image. The parameters I use are consistent with the official code.
Well done! It seems the PyTorch version's results are smoother than the official TF version's. What changes did you make in the PyTorch version? I actually like the PyTorch version's results better. Can you update your repo and release the updated trained weights? Again, great work!
Hi. I added new weights to Google Drive; you can find them in the README. I also uploaded TensorFlow-version weights named
I found the strange color caused by |
The author said you could train without the guided filter and add the guided filter during inference.
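To illustrate, here is a rough numpy-only sketch of a single-channel guided filter (He et al.) applied as an inference-time post-process; the radius and eps defaults are placeholders, not the repo's actual settings:

```python
import numpy as np

def box_filter(x, r):
    """Mean filter of radius r using an integral image (edge-padded)."""
    k = 2 * r + 1
    xp = np.pad(x, r, mode="edge")
    c = np.pad(np.cumsum(np.cumsum(xp, axis=0), axis=1), ((1, 0), (1, 0)))
    return (c[k:, k:] - c[:-k, k:] - c[k:, :-k] + c[:-k, :-k]) / (k * k)

def guided_filter(guide, src, r=1, eps=2e-1):
    """Guided filter on float arrays; smaller eps preserves more edges."""
    mean_I = box_filter(guide, r)
    mean_p = box_filter(src, r)
    cov_Ip = box_filter(guide * src, r) - mean_I * mean_p
    var_I = box_filter(guide * guide, r) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)   # local linear coefficient
    b = mean_p - a * mean_I
    # average the per-window coefficients, then apply the linear model
    return box_filter(a, r) * guide + box_filter(b, r)
```

At inference you would run this per channel on the network output, with the input image as the guide, even if training omitted the filter entirely.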
OK, I will try it if time permits.
One thing you could try with the colors is a color-transfer algorithm, like the one from PyImageSearch. Also, regarding the cartoon noise, the guided filter should help in post-processing by using lower values of epsilon (ε), as in WhiteBox's cartoonize.py.
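For reference, the PyImageSearch color-transfer approach matches channel statistics in the LAB color space; below is a simplified, numpy-only per-channel RGB sketch of the same idea (not the exact PyImageSearch code):

```python
import numpy as np

def color_transfer(source, target):
    """Shift target's per-channel mean/std toward source's (simplified
    Reinhard-style transfer; the original operates in LAB, not RGB)."""
    src = source.astype(np.float64)
    tgt = target.astype(np.float64)
    s_mean, s_std = src.mean(axis=(0, 1)), src.std(axis=(0, 1))
    t_mean, t_std = tgt.mean(axis=(0, 1)), tgt.std(axis=(0, 1))
    out = (tgt - t_mean) / (t_std + 1e-6) * s_std + s_mean
    return np.clip(out, 0, 255).astype(np.uint8)
```

Working in LAB as PyImageSearch does usually gives more natural results, since luminance and chroma are adjusted independently.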
@GustavoStahl thanks, I had missed that the test_code ε value is not equal to the train_code ε. But I don't think the color-transfer algorithm is needed: this model needs to keep the original color as much as possible and only increase the brightness. Regarding the degree of texture, I think it can be adjusted via g_gray_weight, like this.
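For context, the White-box paper's generator objective is a weighted sum of surface, structure, texture, content, and total-variation terms, so a texture weight like g_gray_weight directly scales how much texture survives. A hypothetical sketch; the names and default values here are assumptions, not the repo's actual ones:

```python
# Hypothetical sketch: weight names and defaults are assumptions,
# not values taken from the repo or the paper.
def generator_loss(surface, structure, texture, content, tv,
                   g_blur_weight=0.1, g_gray_weight=1.0,
                   content_weight=200.0, tv_weight=1e4):
    # Raising g_gray_weight strengthens the grayscale (texture) term,
    # i.e. more visible texture detail in the output.
    return (g_blur_weight * surface + structure
            + g_gray_weight * texture
            + content_weight * content + tv_weight * tv)
```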
Adding np.clip() after the guided filter would solve the artifacts.
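Agreed; a minimal sketch of what that could look like before the uint8 cast:

```python
import numpy as np

def to_uint8(img):
    # The guided filter can overshoot slightly outside [0, 255]; casting
    # such values straight to uint8 wraps around and shows up as speckle
    # artifacts, so clip to the valid range first.
    return np.clip(img, 0, 255).astype(np.uint8)
```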
Great work re-implementing the white-box model in PyTorch. However, after testing, I found the results are not as good as the authors' official version; there is still a large gap. What do you think could be the reason? Do we need to train the model longer, or is it something else?