Hi, I'm studying SegNet and your code is excellent for understanding the whole structure of the network. While reading it, I was a little confused about line 35 of inference.py. Could you please tell me why you set activation=False in conv_decode4 = conv_layer_with_bn(initializer, unpool_4, [7, 7, 64, 64], is_training, False, name="conv_decode4") and in the other conv_layer_with_bn calls in the decoder part? Thank you very much!
Honestly, I'm not quite sure and I remember wondering about this as well when I implemented the network. Let me look into it a bit and maybe I can give you an answer :)
From the SegNet paper, I found this in the first paragraph of chapter 3: "No ReLU non-linearity is used in the decoder unlike the deconvolution network [41, 42]. This makes it easier to optimize the filters in each pair." I'm not sure exactly why, but at least it means the choice was tested and shown to work better, I guess.
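To make the difference concrete, here is a minimal plain-Python sketch (hypothetical names, not the repo's actual TensorFlow code) of what an `activation` flag like the one in `conv_layer_with_bn` changes. Assume the conv + batch-norm output is already computed; the flag only decides whether a ReLU is applied on top:

```python
def relu(values):
    # Standard ReLU: clamps negative values to zero.
    return [max(0.0, v) for v in values]

def conv_layer_with_bn_sketch(bn_output, activation=True):
    # Hypothetical stand-in for the layer: `bn_output` plays the role
    # of the conv + batch-norm result. Only the activation differs
    # between encoder (activation=True) and decoder (activation=False).
    return relu(bn_output) if activation else bn_output

bn_out = [-1.5, 0.0, 2.0]
encoder_style = conv_layer_with_bn_sketch(bn_out, activation=True)   # [0.0, 0.0, 2.0]
decoder_style = conv_layer_with_bn_sketch(bn_out, activation=False)  # [-1.5, 0.0, 2.0]
```

So with activation=False, the decoder layers pass batch-normalized values through unchanged, including negative ones, which is exactly the behavior the paper describes for the decoder.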