Problem when setting Residual=True #1

Closed
bernardohenz opened this issue Oct 19, 2017 · 6 comments
Comments

@bernardohenz

Hello,

I am trying to create a model with residual connections (residual=True). I am using the following command:

from unet import UNet
myModel = UNet((3,256,256),out_ch=3,dropout=0.15,batchnorm=True,residual=True)

but I am getting the following error:

ValueError: `Concatenate` layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 3, 256, 256), (None, 64, 256, 256)]

The error points to the last line of conv_block.

I am currently using Theano as the backend. I have already tried passing the input shape as channels-last (img_shape=(256,256,3)), but got a similar error. I also tried setting the axis of the Concatenate, without success.

PS: everything works when I use residual=False, but I would really like to use shortcut connections.

@pietz
Owner

pietz commented Oct 20, 2017

Truth be told, I only tested this with the TensorFlow backend, and now that Theano has reached its end of life, there's another reason to use TF.

Try changing both concatenations in the code to axis=1. There's one in conv_block and another in level_block.

Also, residual connections are usually implemented as an addition rather than a concatenation, but that would lead to similar shape problems in the first conv_block. The difference should be minor.
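
Something along these lines might work (a rough sketch, not the exact code from unet.py; the tensor names and the 1x1 projection are illustrative assumptions):

from keras.layers import Input, Conv2D, Concatenate, Add
from keras.models import Model

# channels-first input, matching the Theano ordering used in this thread
inp = Input((3, 256, 256))
conv = Conv2D(64, 3, padding='same', activation='relu',
              data_format='channels_first')(inp)

# concatenation shortcut: only the channel axis may differ, so set axis=1
cat = Concatenate(axis=1)([inp, conv])   # -> (None, 67, 256, 256)

# addition shortcut: channel counts must match, so project the input
# with a 1x1 convolution first (an assumption, not what unet.py does)
proj = Conv2D(64, 1, data_format='channels_first')(inp)
res = Add()([proj, conv])                # -> (None, 64, 256, 256)

model = Model(inp, [cat, res])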

@bernardohenz
Author

I had already tested this and the problem was similar. But I switched to the TensorFlow backend and it worked.
I think the shape mismatch was a combination of mismatches among the input, the convolutions, and the Concatenate channel axis, but now it is working perfectly.

Btw, nice code, very clean and easy to use. I would suggest adding the option to choose between Batch and Instance Normalization, but Keras does not have Instance Normalization yet (only keras-contrib does).
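
For example, something like this might work (an untested sketch, assuming keras-contrib is installed):

from keras.layers import Input, Conv2D
from keras_contrib.layers import InstanceNormalization  # not in core Keras

x = Input((256, 256, 3))
n = Conv2D(64, 3, padding='same', activation='relu')(x)
# drop-in replacement for BatchNormalization inside the conv block
n = InstanceNormalization(axis=-1)(n)  # axis=1 for channels-first data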

Congrats on the work.

@pietz
Owner

pietz commented Oct 20, 2017

Thanks for the nice words!

@dscarmo

dscarmo commented Jun 22, 2018

Hello, I have the same problem with the code as it is in the repository, except it occurs any time I use an input_shape that is not a multiple of 2.

(128, 128, 1) works, (512, 512, 1) works, but other shapes, even the one used in the original paper (572, 572, 1), will output the same error.

For (181, 181, 1), I get the following output:

ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 45, 45, 256), (None, 44, 44, 256)]

I tried using axis=1 in the concatenates, and I also tried changing start_ch, depth, and inc_rate, to no avail. The output:

ValueError: A Concatenate layer requires inputs with matching shapes except for the concat axis. Got inputs shapes: [(None, 45, 45, 256), (None, 44, 44, 256)]

The code is very nice and clean, good work!

For now I will try using center patches of 128x128.
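
Something like this should do it (a rough sketch, assuming channels-last numpy arrays; the helper name is hypothetical):

import numpy as np

def center_patch(img, size=128):
    # crop a size x size patch from the center of an (H, W, C) array
    h, w = img.shape[:2]
    top, left = (h - size) // 2, (w - size) // 2
    return img[top:top + size, left:left + size]

print(center_patch(np.zeros((181, 181, 1))).shape)  # (128, 128, 1)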

@sakvaua

sakvaua commented Jul 5, 2018

@dscarmo The original paper uses valid convolutions, while this implementation uses same padding. With your shape, 572/2 = 286 and 286/2 = 143, which is not divisible by two, and that's why it gives you an error. Your images should have shapes that are divisible by two as many times as the depth of your network.
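
A quick way to check a candidate size (just a sketch, assuming the default depth of 4):

def downsamples_cleanly(size, depth=4):
    # True if `size` can be halved `depth` times without a remainder
    for _ in range(depth):
        if size % 2:
            return False
        size //= 2
    return True

for s in (128, 512, 572, 181):
    print(s, downsamples_cleanly(s))  # True, True, False, False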

@pietz pietz closed this as completed Sep 11, 2018
@karanpathak

karanpathak commented Feb 17, 2020

Very well written code!
