
Relu activation in PrimaryCap? #25

Closed
oargueta3 opened this issue Nov 9, 2017 · 3 comments


@oargueta3

tf.contrib.layers.conv2d applies a ReLU activation, but the PrimaryCap convolution does not include a ReLU activation before grouping neurons into capsules and squashing them. Or did I miss something from the paper?

capsules = tf.contrib.layers.conv2d(input, self.num_outputs * self.vec_len,
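For context, a minimal sketch of the point being raised (the function name `primary_caps` and the `use_relu` toggle below are illustrative, not the repository's actual code): `tf.contrib.layers.conv2d` defaults to `activation_fn=tf.nn.relu`, so a ReLU is applied unless `activation_fn=None` is passed explicitly.

```python
# Minimal sketch (TF 1.x); names like primary_caps and use_relu are illustrative.
import tensorflow as tf

def primary_caps(inputs, num_capsules=32, vec_len=8, kernel_size=9, stride=2,
                 use_relu=True):
    # tf.contrib.layers.conv2d uses activation_fn=tf.nn.relu by default,
    # so a ReLU is applied unless activation_fn=None is given.
    caps = tf.contrib.layers.conv2d(
        inputs, num_capsules * vec_len, kernel_size, stride,
        padding='VALID',
        activation_fn=tf.nn.relu if use_relu else None)
    # Group the conv outputs into capsules of length vec_len; squashing comes after.
    caps = tf.reshape(caps, [tf.shape(inputs)[0], -1, vec_len, 1])
    return caps
```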

@naturomics
Owner

In fact, I have the same doubt. I've read the paper again and again, but I can't find anything in it that says whether PrimaryCaps has a ReLU activation or not, so I just let it go. Maybe I should add a comment in the code to note that. I'll try out which option is better and use it as the default.

Thank you for your comment

www0wwwjs1 added a commit to www0wwwjs1/CapsNet-Tensorflow that referenced this issue Nov 9, 2017
Fix naturomics#25, rm relu activation fun from primarycap
@naturomics
Owner

@oargueta3 I've run an experiment on this. The results show that using ReLU gives a higher test accuracy (batch size 48, 50 epochs, ~99.4) than no ReLU (batch size 48, 30 epochs; the best result is 99.1 at step 8600, but most of the time it stays around 99.0).

@stoneyang

@naturomics, thanks for your work; it is great and really quick!

It seems that there's no ReLU between the capsule convolution and squashing. As mentioned in the third paragraph of Sec. 4 of the original paper, in both v1 and v2:

One can see PrimaryCapsules as a Convolutional layer with Eq. 1 as its block non-linearity.

So, I think we may confirm that no ReLU is placed there.

Any suggestions?
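For reference, the "block non-linearity" in that quote is the squash function of Eq. 1. A minimal sketch, assuming `s` has shape `[batch, num_caps, vec_len]` and adding a small epsilon for numerical stability (the function name and epsilon value are illustrative):

```python
import tensorflow as tf

def squash(s, axis=-1, epsilon=1e-9):
    # Eq. 1: v_j = (||s_j||^2 / (1 + ||s_j||^2)) * (s_j / ||s_j||)
    squared_norm = tf.reduce_sum(tf.square(s), axis=axis, keepdims=True)
    scale = squared_norm / (1.0 + squared_norm)
    return scale * s / tf.sqrt(squared_norm + epsilon)
```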
