In the conv2d_block function, the x coming out of the activation function after the first layer should go into the second convolution layer, instead of the original input tensor.
The rewritten code for the second layer in the function should be:
# second layer
x = Conv2D(filters=n_filters, kernel_size=(kernel_size, kernel_size),
           kernel_initializer='he_normal', padding='same')(x)  # this was the change
if batchnorm:
    x = BatchNormalization()(x)
x = Activation('relu')(x)
return x
Also, in the inference pipeline you are using contours from the ground truth of the validation set and plotting them on the predictions for the validation set, which makes no sense.
That is a misleading way to show good predictions.
Thanks for pointing out the mistakes.
I will try to correct them as I get some time.
By the way, these are just practice projects that I did for self-learning, and I completely agree that there can be some mistakes in them. I have no intention of showing off great predictions.
Also, I agree with your first point; however, there is no problem with the inference pipeline. The contour (boundary) has to come from the ground truth to show how much the predictions overlap with the real boundary. Clearly the validation-set predictions are not 100% correct.
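The overlay described here can be sketched without the original code. This is a minimal NumPy version, assuming binary masks as 2D arrays; the boundary is extracted by comparing each pixel to its 4-neighbours (a simple stand-in for whatever contour routine, e.g. cv2.findContours, the repo actually uses):

```python
import numpy as np


def mask_boundary(mask):
    """Return the 1-pixel boundary of a binary mask.

    A pixel is on the boundary if it is foreground but at least one of
    its 4-neighbours is background.
    """
    m = mask.astype(bool)
    interior = m.copy()
    interior[1:, :] &= m[:-1, :]   # neighbour above
    interior[:-1, :] &= m[1:, :]   # neighbour below
    interior[:, 1:] &= m[:, :-1]   # neighbour left
    interior[:, :-1] &= m[:, 1:]   # neighbour right
    return m & ~interior


def overlay_gt_contour(prediction, gt_mask):
    """Draw the ground-truth boundary (value 2) on top of a binary prediction."""
    out = prediction.astype(np.uint8).copy()
    out[mask_boundary(gt_mask)] = 2
    return out


# toy example: a 3x3 ground-truth square vs. a prediction shifted by one pixel
gt = np.zeros((5, 5), dtype=np.uint8)
gt[1:4, 1:4] = 1
pred = np.zeros((5, 5), dtype=np.uint8)
pred[2:5, 2:5] = 1
vis = overlay_gt_contour(pred, gt)
```

In `vis`, pixels where the prediction and the ground-truth outline disagree are immediately visible, which is exactly the overlap check the reply describes.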