normalization #2
The normalization step might be wrong:
https://github.com/hysts/pytorch_image_classification/blob/master/dataloader.py#L134
The correct order is:
https://github.com/kuangliu/pytorch-cifar/blob/master/main.py#L35
Any idea?
Comments
Hi, thank you for pointing it out. But I'm not sure that one of them is "correct" and the other is "wrong." I guess the effect of the normalization order is negligible and it's just a matter of taste. That being said, since you brought it up, I checked some of the official implementations of papers and found that both orders are used. The implementations of ResNet, DenseNet, and ResNeXt apply normalization first and then apply zero padding, horizontal flip, and random crop. The implementations of PyramidNet, Cutout, and RandomErasing apply zero padding, random crop, and horizontal flip, and then apply normalization. The implementation of Wide ResNet is a bit different: it applies reflection padding, random crop, and horizontal flip, and then applies normalization. So it seems there's no "correct" order of normalization after all.
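For what it's worth, the only place where the two orders can actually differ is in how the padded border ends up being filled. Here is a minimal NumPy sketch of that difference (the fake image, the fixed crop offsets, and the mean/std values below are purely illustrative assumptions, not taken from any of the implementations above):

```python
import numpy as np

rng = np.random.default_rng(0)
# A fake CIFAR-like image in [0, 1], shape (H, W, C).
image = rng.random((32, 32, 3), dtype=np.float32)
mean = np.array([0.49, 0.48, 0.45], dtype=np.float32)
std = np.array([0.25, 0.24, 0.26], dtype=np.float32)

def normalize(img):
    return (img - mean) / std

def pad_and_crop(img, pad=4, size=32, top=3, left=5):
    # Zero-pad the spatial dims, then take a fixed crop
    # (a random crop would sample top/left instead).
    padded = np.pad(img, ((pad, pad), (pad, pad), (0, 0)), mode="constant")
    return padded[top:top + size, left:left + size]

# Normalize first, then pad/crop (ResNet-style order).
a = pad_and_crop(normalize(image))
# Pad/crop first, then normalize (Cutout-style order).
b = normalize(pad_and_crop(image))

# Interior pixels are identical; only padded pixels that land inside the
# crop differ, because they stay at 0 in one order and become
# -mean / std in the other.
print(np.abs(a - b).max())
```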
You normalize the tensor with the mean and std first. For example, in MNIST the mean is 0.1307 and the std is 0.3081, and those statistics are for images already scaled to [0, 1]. Suppose a pixel is 240; in PyTorch your code then produces (240 - 0.1307) / 0.3081. Doesn't that look weird? It seems the order should be: normalize the tensor to [0, 1] first, and then normalize by the mean and std. For example, the code should be changed so that it normalizes the tensor with the mean and std after normalizing it to [0, 1].
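A minimal torchvision sketch of the suggested order (using the MNIST statistics quoted above):

```python
import torchvision.transforms as transforms

# ToTensor converts a PIL image / uint8 array to a float tensor in [0.0, 1.0],
# so Normalize sees values on the same scale as the 0.1307 / 0.3081 statistics.
# With this order, a raw pixel of 240 becomes (240 / 255 - 0.1307) / 0.3081.
transform = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize((0.1307,), (0.3081,)),
])
```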
Oh, I completely misunderstood your point. Thanks for the clarification. Sorry, I think my implementation is a bit confusing, and that led to your concern. I implemented the normalization myself instead of using torchvision.transforms.Normalize. When using torchvision.transforms.Normalize, it indeed has to come after ToTensor, as you describe. But, in my implementation, the normalization step itself divides the image by 255 before subtracting the mean and dividing by the std, so the values are already scaled to [0, 1] when the mean and std are applied.
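A rough sketch of the kind of normalization step described here, just for illustration (this is not the repository's actual code, and the class name is made up):

```python
import numpy as np

class NormalizeExample:
    """Scale a uint8 image to [0, 1] and standardize it in a single step,
    so no separate ToTensor-style scaling is needed beforehand."""

    def __init__(self, mean, std):
        self.mean = np.asarray(mean, dtype=np.float32)
        self.std = np.asarray(std, dtype=np.float32)

    def __call__(self, image):
        image = np.asarray(image, dtype=np.float32) / 255.0  # scale to [0, 1]
        return (image - self.mean) / self.std                # then mean/std
```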
It's my mistake. Thank you.