Different eval metrics for model with/without bn #32
Comments
Hey, this is my question as well, and I still have no idea why it happens. Maybe there is a bug in my bn_model, but I can't find it. Another possibility is that the author did not upload the final version of the bnnomerge model. If you find anything helpful to this issue, please tell me. Thanks.
The weights provided by them are fine, as I can actually convert the train_30k_bnnomerge weights to train_30k weights by merging the batch-norm layer weights into the conv layer weights. I also tried comparing the activations after every bn layer (for the bn model) and conv layer (for the no-bn model), and noticed that the error between them keeps increasing as we go deeper. My guess is that the bn weights are somehow not being loaded properly, or that something else is buggy with the bn model.
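The merging step described above (folding batch-norm parameters into the preceding conv layer) can be sketched as follows. This is a generic BN-folding routine, not the exact code from the linked script; the parameter names (`gamma`, `beta`, `mean`, `var`) and the weight layout `(out_ch, in_ch, kh, kw)` are assumptions for illustration.

```python
import numpy as np

def fold_bn_into_conv(W, b, gamma, beta, mean, var, eps=1e-5):
    """Fold batch-norm parameters into the preceding conv layer.

    W: conv weights, assumed shape (out_ch, in_ch, kh, kw)
    b: conv bias of shape (out_ch,), or None
    gamma, beta, mean, var: per-channel BN parameters, shape (out_ch,)
    Returns (W', b') such that conv(x, W', b') == bn(conv(x, W, b)).
    """
    if b is None:
        b = np.zeros_like(mean)
    scale = gamma / np.sqrt(var + eps)          # per-channel BN scale
    W_folded = W * scale[:, None, None, None]   # scale each output filter
    b_folded = (b - mean) * scale + beta        # shift absorbed into bias
    return W_folded, b_folded
```

Since convolution is linear in the weights, applying BN after the conv is equivalent to a conv with these rescaled weights and shifted bias, which is why the merged (`train_30k`) and unmerged (`train_30k_bnnomerge`) models should score identically.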
Hey, can you tell me how you convert the weights by merging the batch-norm layer weights into the conv weights? Did you do this using .npy or .caffemodel files? I think the problem may be that the bn weights are not being converted properly; I'll check again.
I used the npy weights. Here's the script I wrote: https://github.com/alasin/python-scripts/blob/master/convert_bn.py
Thank you for the script. I'll check how I load the bn weights, and the model as well.
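One BN-loading pitfall worth checking when the weights originate from a .caffemodel: Caffe's BatchNorm layer stores a third blob holding a moving-average scale factor, and the accumulated mean/variance must be divided by it before use. Skipping that division silently corrupts the statistics. A minimal sketch (the function name and blob list format here are illustrative assumptions, not code from this repository):

```python
import numpy as np

def normalize_caffe_bn_stats(bn_blobs):
    """Convert Caffe BatchNorm blobs into usable mean/variance.

    Caffe's BatchNorm layer keeps three blobs:
      bn_blobs[0] = accumulated mean
      bn_blobs[1] = accumulated variance
      bn_blobs[2] = moving-average scale factor (single value)
    The stored statistics must be divided by the scale factor;
    forgetting this division is a classic cause of wrong BN outputs.
    """
    mean, var, factor = bn_blobs
    scale = 0.0 if factor[0] == 0 else 1.0 / factor[0]
    return mean * scale, var * scale
```

If the conversion script copied blobs[0] and blobs[1] straight into the npy file without this normalization, the bn model would produce slightly wrong activations at every BN layer, which compounds with depth, consistent with the growing error observed above.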
Hi,
When I evaluate on Cityscapes using the train_30k and train_30k_bnnomerge models, I get different mIOUs of 65.6% and 59.3% respectively. As per my understanding, they should ideally give the same results. Am I missing something?
Thanks