
Why do you normalize each image's pixel values to [-1.6, 1.6] when training and testing? #8

Open
yangtian62 opened this issue Dec 14, 2018 · 5 comments

Comments

@yangtian62

Hi, thanks for your inspiring work! Just one point I don't understand: why do you set
img = np.array(img, np.float32).transpose(2,0,1)/255.0 * 3.2 - 1.6
when training? Are there any advantages to this operation?

@zlckanata
Owner

This operation normalizes the image from [0,255] to [-1.6,1.6]. You can normalize the image to any other range (preferably zero-centered); [-1,1] or [-1.2,1.2] are fine :)
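
For reference, a minimal sketch of that mapping (the `normalize` helper and the dummy input below are only illustrative, not code from this repo):

```python
import numpy as np

def normalize(img, half_range=1.6):
    """Map an HxWx3 uint8 image to a CxHxW float32 array in [-half_range, half_range]."""
    img = np.array(img, np.float32).transpose(2, 0, 1) / 255.0   # scale to [0, 1]
    return img * (2.0 * half_range) - half_range                 # shift to a zero-centered range

# Dummy input; in the training script this would be the loaded road image.
img = np.random.randint(0, 256, (1024, 1024, 3), dtype=np.uint8)
x_wide = normalize(img, half_range=1.6)  # [-1.6, 1.6], as in the repo
x_unit = normalize(img, half_range=1.0)  # [-1, 1], also fine as noted above
```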

@yangtian62
Author

Thanks for your reply~ Is 1.6 a particular number that works best in your case? And why don't you also apply a per-channel std? People usually set the mean and std together. Haha~

@zlckanata
Owner

In D-LinkNet, each conv layer is followed by a batchnorm (conv-bn-relu), and in fact there is no difference between [-1.6,1.6] and [-1,1] (or even [0,255]) in this case.
However, UNet (with weights initialized by the PyTorch default initializer) without batchnorm does get better results with [-1.6,1.6] normalization. In that case, [-1.6,1.6] (compared with [-1,1]) is equivalent to enlarging the initial learning rate and the std of the initializer.
In D-LinkNet, channel-wise normalization just amounts to adding a couple of parameters that reweight the RGB channels. It may give better results (I'm not sure), but I don't think it is crucial~ 2333
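
If someone does want channel-wise mean/std, a sketch of what it could look like (the per-channel statistics below are the common ImageNet values, used purely as an illustration; they are not values from this repo):

```python
import numpy as np

# Illustrative per-channel statistics (ImageNet values), not from this repo.
MEAN = np.array([0.485, 0.456, 0.406], np.float32).reshape(3, 1, 1)
STD  = np.array([0.229, 0.224, 0.225], np.float32).reshape(3, 1, 1)

def normalize_channelwise(img):
    """HxWx3 uint8 -> CxHxW float32 with zero mean and unit std per channel."""
    img = np.array(img, np.float32).transpose(2, 0, 1) / 255.0
    return (img - MEAN) / STD

img = np.random.randint(0, 256, (1024, 1024, 3), dtype=np.uint8)
x = normalize_channelwise(img)
```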

@yangtian62
Author

Alright, thanks again. Happy New Year btw~ Lol

@zlckanata
Owner

Happy New Year~ www
