Optimizer initialization issue in DeepLabv3+ #21
Thank you for suggesting the revision. Judging from the last snippet, I think the issue is related to the improved ASPP module in v3+ rather than the non-biased conv in ResNet. The parameters of the ResNet part are yielded in the "1x" scope without causing the NoneType error.
I think you are right!
class _ConvBatchNormReLU(nn.Sequential):
Do you mean the _ConvBatchNormReLU in the v3+ ASPP? As I mentioned above: the non-biased conv follows the official implementation, and the init part here is only for v2.
Sorry to bother you!
Recently, I tried to use DeepLabv3+ to train a new model.
Also, I'm very thankful that you provide the code for the model.
However, the following error occurs:
I think the issue is that the bias term in the first convolution layer is set to False,
which is the default setting in the standard ResNet.
However, the initialization part yields this bias term into the SGD constructor,
so SGD raises an exception because the param is NoneType.
Here is the part of the SGD source:
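The error can be reproduced directly with a small example (a sketch with arbitrary layer sizes, not the project's actual training code); the optimizer's parameter check is what fires:

```python
import torch

# bias=False means conv.bias is None, matching the ResNet stem setup.
conv = torch.nn.Conv2d(3, 64, kernel_size=7, stride=2, padding=3, bias=False)

try:
    # Passing the None bias into the constructor triggers the check
    # in torch.optim, which only accepts Tensors as parameters.
    torch.optim.SGD([conv.weight, conv.bias], lr=0.1)
except TypeError as e:
    print("SGD raises:", e)
```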
I'll give some advice at the end!
Maybe we can add a check in train.py for whether the bias term is None, like the following:
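A sketch of such a guard in the parameter generator (the function name and the surrounding code are assumptions; only the None-check is the point):

```python
import torch.nn as nn


def get_bias_params(model):
    """Yield only the bias terms that actually exist."""
    for module in model.modules():
        if isinstance(module, nn.Conv2d):
            # Skip conv layers built with bias=False, whose .bias is None;
            # yielding None into SGD is what raised the TypeError.
            if module.bias is not None:
                yield module.bias


# Usage: only real tensors reach the SGD constructor.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, bias=False),  # bias is None -> skipped
    nn.Conv2d(16, 16, 3, bias=True),
)
print(len(list(get_bias_params(model))))  # -> 1
```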
After this small revision, the code can run normally.