Should the input to the include_top ignore the input to the last denseblock? #23
Comments
That's not a skip connection. That is a pixel-wise classification layer (each pixel's feature vector is convolved with a point-wise convolution to classify that pixel).
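To illustrate what "point-wise convolution" means here, below is a minimal NumPy sketch (not the repository's code) showing that a 1x1 convolution classifies each pixel independently by projecting its channel vector to per-class scores. The function name and shapes are hypothetical:

```python
import numpy as np

def pointwise_classify(features, weights):
    """Pixel-wise classification via a 1x1 ("point-wise") convolution.

    features: (H, W, C) feature map
    weights:  (C, nb_classes) projection, equivalent to a Conv2D
              with a (1, 1) kernel and no bias
    """
    h, w, c = features.shape
    # Each pixel's C-dim feature vector is projected to nb_classes scores;
    # no spatial mixing happens, which is why it is "pixel-wise".
    return features.reshape(h * w, c).dot(weights).reshape(h, w, -1)

feats = np.random.rand(4, 4, 8)      # 4x4 feature map, 8 channels
w = np.random.rand(8, 3)             # 3 output classes
scores = pointwise_classify(feats, w)
print(scores.shape)                  # (4, 4, 3)
```

This is the same computation as the `Conv2D(nb_classes, (1, 1), ...)` call discussed in this issue, just written out explicitly.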
I mean that the input to the pixel-wise classification layer, the so-called 'x_up', concatenates the input of its corresponding dense block. This differs from the paper, where the input to the classification layer should not be concatenated with the input of the dense block. And following your code, it should be: So I think I may have missed something. Could you give me some guidance?
I'll look into it. If you can figure out where to correct it, please submit a PR.
In the DenseNet FCN paper, the diagram in Figure 1 shows no skip connection from the input to the output of the last dense block in the decompression (upsampling) path.
But the function '__create_fcn_dense_net' uses a skip connection, at line 770 of densenet.py:

```python
x = Conv2D(nb_classes, (1, 1), activation='linear', padding='same', use_bias=False)(x_up)
```
Do I misunderstand it? Or is this operation an improvement that I missed?
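The two wirings being contrasted can be sketched with plain NumPy shapes. This is only an illustration of the question, under assumed channel counts (the variable names `block_input` and `block_output` are hypothetical, not from densenet.py):

```python
import numpy as np

# Suppose the last dense block in the upsampling path receives
# 'block_input' and produces 'block_output' new feature maps.
block_input = np.random.rand(1, 32, 32, 48)   # features entering the block
block_output = np.random.rand(1, 32, 32, 16)  # features the block produces

# Reading of the paper's Figure 1: the 1x1 classifier sees only the
# last dense block's output -- no outer skip concatenating its input.
x_paper = block_output

# Reading of the code in question: 'x_up' also concatenates the block's
# input, so the classifier sees 48 + 16 = 64 channels instead of 16.
x_code = np.concatenate([block_input, block_output], axis=-1)

print(x_paper.shape[-1], x_code.shape[-1])  # 16 64
```

The difference only changes how many feature channels the point-wise classification layer consumes; the spatial resolution is the same in both cases.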