
[WIP] Capsnet layers #7391

Merged
45 commits merged on Apr 5, 2019
Changes from 1 commit
0fce147
Merge pull request #1 from deeplearning4j/master
rnett Mar 29, 2019
e6dead4
Main capsule layer start, and utilities
rnett Mar 29, 2019
e7e8604
Config validation
rnett Mar 29, 2019
c1c8ccd
variable minibatch support
rnett Mar 29, 2019
fd2458c
bugfix
rnett Mar 29, 2019
83d32c9
capsule strength layer
rnett Mar 29, 2019
7d5f0e8
docstring
rnett Mar 29, 2019
f713125
faster dimension shrink
rnett Mar 29, 2019
b762744
add a keepDim option to point SDIndexes, which if used will only cont…
rnett Mar 29, 2019
7e634ab
keepDim test
rnett Mar 29, 2019
b910cf7
Merge branch 'master' into rnett-point-index-keep-dim
rnett Mar 29, 2019
46445f6
PrimaryCaps layer
rnett Mar 29, 2019
fe35614
fixes
rnett Mar 29, 2019
0a863fa
more small fixes
rnett Mar 29, 2019
39d4f3a
config and shape inference tests
rnett Mar 29, 2019
19b2a00
Merge branch 'master' into rnett-capsnet
rnett Mar 29, 2019
6acbd71
Changed default routings to 3 as per paper
rnett Mar 29, 2019
63ba472
better docstrings
rnett Mar 29, 2019
d70c97e
squash fixes
rnett Mar 29, 2019
151f6cc
better test matrix
rnett Mar 29, 2019
ef3817a
Merge remote-tracking branch 'origin/rnett-point-index-keep-dim' into…
rnett Mar 30, 2019
a6d199e
init weights to 1 (need to use param)
rnett Mar 30, 2019
e22c82c
Proper weight initialization
rnett Mar 30, 2019
6e73717
Undo changes to wrong files
rnett Mar 30, 2019
cabfda5
Single layer output tests
rnett Mar 30, 2019
16c5dc2
MNIST test (> 95% acc, p, r, f1)
rnett Mar 30, 2019
28abe35
need an updater...
rnett Mar 30, 2019
9783c05
cleanup
rnett Mar 30, 2019
6b8246e
Merge branch 'master' into rnett-capsnet
rnett Mar 30, 2019
f39ece0
added license to tests
rnett Mar 30, 2019
a3e7d7a
gradient check
rnett Mar 30, 2019
6214734
fixes
rnett Mar 31, 2019
f8853d3
optimize imports
rnett Mar 31, 2019
f967ed7
fixes
rnett Mar 31, 2019
e6bbab7
fixes
rnett Mar 31, 2019
a777d4c
fix CapsuleLayer output shape
rnett Apr 1, 2019
68fe093
fixes and test update
rnett Apr 1, 2019
4a34a45
typo fix
rnett Apr 1, 2019
285723c
test fix
rnett Apr 1, 2019
75b146d
shape comments
rnett Apr 1, 2019
146ea9f
variable description comments
rnett Apr 1, 2019
6faa787
Merge branch 'master' into rnett-capsnet
rnett Apr 1, 2019
fbd5a83
optimized imports
rnett Apr 1, 2019
3392604
better initialization
rnett Apr 4, 2019
78ddb17
Revert "better initialization"
rnett Apr 5, 2019
docstring

rnett committed Mar 29, 2019
commit 7d5f0e84840b14e2770a3753c4d83814b34087af
@@ -23,6 +23,15 @@
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;

/**
* A layer to get the "strength" of each capsule, that is, the probability of it being present in the input.
* This is the vector length or L2 norm of each capsule's output.
* The lengths will not exceed one because of the squash function.
*
* CapsNet is from <a href="http://papers.nips.cc/paper/6975-dynamic-routing-between-capsules.pdf">Dynamic Routing Between Capsules</a>
*
* @author Ryan Nett
*/
public class CapsuleStrengthLayer extends SameDiffLambdaLayer {
@Override
    public SDVariable defineLayer(SameDiff SD, SDVariable layerInput) {
        // Diff truncated in this capture; reconstruction of the body: the L2 norm
        // over the capsule dimension, [mb, capsules, capsuleDim] -> [mb, capsules]
        return SD.norm2("caps", layerInput, 2);
    }
}
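The strength described in the docstring is just the per-capsule L2 norm, and it stays below one because of the squash nonlinearity from the CapsNet paper. A minimal plain-Java sketch of both (illustrative only, independent of the ND4J/SameDiff API; class and method names are made up here):

```java
// CapsuleMath.java — illustrative sketch, not part of this PR
public class CapsuleMath {

    // "Strength" of one capsule: the L2 norm of its output vector
    public static double strength(double[] capsule) {
        double sumSq = 0;
        for (double v : capsule) sumSq += v * v;
        return Math.sqrt(sumSq);
    }

    // Squash from "Dynamic Routing Between Capsules":
    // v = (||s||^2 / (1 + ||s||^2)) * s / ||s||, so ||v|| < 1 always
    public static double[] squash(double[] s) {
        double n = strength(s);
        // small epsilon guards against division by zero for the zero vector
        double scale = (n * n) / (1 + n * n) / (n + 1e-12);
        double[] v = new double[s.length];
        for (int i = 0; i < s.length; i++) {
            v[i] = s[i] * scale;
        }
        return v;
    }

    public static void main(String[] args) {
        double[] s = {3.0, 4.0};            // ||s|| = 5
        double[] v = squash(s);
        System.out.println(strength(v));    // 25/26 ≈ 0.9615, strictly < 1
    }
}
```

For a whole batch, the layer applies this norm along the capsule-dimension axis (dimension 2 of a `[minibatch, capsules, capsuleDim]` tensor), collapsing each capsule vector to a single scalar strength.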