
Small question about Xnet vs Nestnet difference #31

Open
CozyDoomer opened this issue Nov 14, 2019 · 5 comments

Comments

@CozyDoomer
CozyDoomer commented Nov 14, 2019

The only architectural difference between the two seems to be this:

  • nestnet
    skip=interm[(n_upsample_blocks+1)*i+j]
  • xnet
    skip=interm[(n_upsample_blocks+1)*i: (n_upsample_blocks+1)*i+j+1]

in the upblock parameters.

Can someone confirm this and maybe quickly explain how it affects the model?
Would they both count as the UNet++ architecture from the paper, just with different skip connections?
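To make the difference concrete, here is a toy sketch (not the library code; `interm` and the layout are simplified stand-ins) of the two skip-selection schemes quoted above. `nestnet` picks a single intermediate output, while `xnet` slices out all outputs of the same row up to column `j`, which is the dense skip pattern described in the UNet++ paper.

```python
# Toy illustration of the two skip-selection schemes. `interm` is a flat
# list of intermediate node outputs, laid out as (n_upsample_blocks + 1)
# entries per row i, as in the indexing expressions from the issue.
n_upsample_blocks = 3
interm = [f"x_{i}_{j}" for i in range(n_upsample_blocks + 1)
                       for j in range(n_upsample_blocks + 1)]

i, j = 1, 2
# nestnet: a single skip tensor from node (i, j)
nest_skip = interm[(n_upsample_blocks + 1) * i + j]
# xnet: all outputs of row i up to and including column j (dense skips)
xnet_skip = interm[(n_upsample_blocks + 1) * i : (n_upsample_blocks + 1) * i + j + 1]

print(nest_skip)   # x_1_2
print(xnet_skip)   # ['x_1_0', 'x_1_1', 'x_1_2']
```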

@zsk-tech

I think xnet is the real UNet++. nestnet doesn't connect the current layer to all previous layers, so the accuracy of nestnet is not as high as xnet's.
I think nestnet is wrong, and xnet is a modified version of nestnet.

And I think there is also something wrong with xnet, even if I'm not sure.
For example, in xnet the author reverses the order when using decoder_filters=[256, 128, 64, 32, 16],
so the number of filters the author wants to use should be [16, 32, 64, 128, 256].
But the number of filters the author actually uses is [32, 64, 128, 256, 16].
I also found other things (which I think are wrong) that I don't understand very well.
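For what it's worth, the observed order [32, 64, 128, 256, 16] is what you would get from the reversed list with an off-by-one index that wraps around. This is only a hypothetical sketch of the indexing being described, not the actual build_xnet code:

```python
# Hypothetical reconstruction of the indexing described above (not the
# actual build_xnet code): reverse decoder_filters, then read the filter
# count for block i as rev[(i + 1) % len(rev)], wrapping at the end.
decoder_filters = [256, 128, 64, 32, 16]
rev = decoder_filters[::-1]                          # [16, 32, 64, 128, 256]
used = [rev[(i + 1) % len(rev)] for i in range(len(rev))]
print(used)                                          # [32, 64, 128, 256, 16]
```

If build_xnet does something equivalent to this, it would explain the mismatch without the intended order being wrong, just shifted by one.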

@CozyDoomer (Author)

Hey @zsk-tech, thank you for your input.

I agree that the connections are different,
but that doesn't necessarily mean it is 'wrong': it could just be a version of UNet++ with fewer skip connections that still works correctly, right?

What we can say is it makes a difference in the number of trainable parameters.
E.g. for resnet18:

# xnet resnet18
Total params: 18,273,898
Trainable params: 18,261,156
Non-trainable params: 12,742

# nestnet resnet18
Total params: 17,462,890
Trainable params: 17,450,148
Non-trainable params: 12,742

I was also confused by the filters of the additional decoder skip connections.

Where did you see the author using [32 64 128 256 16] number of filters?
For me, when I look up the filters of the layers that are used as skip connections, it looks like this for resnet18:

stage4_unit1_relu1 (Activation) (None, 16, 16, 256)
stage3_unit1_relu1 (Activation) (None, 32, 32, 128)
stage2_unit1_relu1 (Activation) (None, 64, 64, 64)
relu0 (Activation)              (None, 128, 128, 64)
relu1 (Activation)              (None, 8, 8, 512)
stage3_unit2_relu1 (Activation) (None, 16, 16, 256)
stage2_unit2_relu1 (Activation) (None, 32, 32, 128)
stage1_unit2_relu1 (Activation) (None, 64, 64, 64)

But maybe you looked into what actually happens in build_xnet and that's why I'm confused about that statement.

@zsk-tech

Thank you for your explanation.
I think what you said is right: xnet and nestnet are just two different versions.

Q: Where did you see the author using [32 64 128 256 16] number of filters?

A: I derived it from the code in build_xnet. But it is also possible that my understanding of the code is incorrect.

@CozyDoomer (Author)

Okay, that makes sense. I'll try to look at it when I find the time and report back!

The build_xnet function is not easy to unpack, sadly; maybe the authors will find time to answer in the meantime :)

@zsk-tech

Okay
