The layer before the auxiliary loss layer #21
Hi @hszhao, what is the layer setting before the auxiliary loss layer? A simple 3x3 conv layer, or another pyramid pooling module?

Comments
I've tried with {ProjectionConv, BN, Relu, Dropout, Conv2d->NumClasses} as an aux branch. The results were 'meh'.
@authman Hi, I don't think that is what they used. I actually loaded up their released weights with their deploy prototxt and inspected the "ignored layers" output by Caffe. Apparently it was a conv/bn/relu/dropout block. Personally, I think the key to achieving their stellar accuracy is fine-tuning the BN parameters on VOC across multiple GPUs. That is also confirmed by DeepLab v3, which did the BN trick and obtained almost the same accuracy as PSPNet.
@qizhuli: do you know where I can get the DeepLab v3 source code? I did not find it. Thanks.
@mjohn123 I don't think they have released it yet.