A Capsule Network-based Embedding Model for Knowledge Graph Completion and Search Personalization (NAACL 2019)


This program provides the implementation of the capsule network-based model CapsE as described in the paper:

      @inproceedings{Nguyen2019CapsE,
        author={Dai Quoc Nguyen and Thanh Vu and Tu Dinh Nguyen and Dat Quoc Nguyen and Dinh Phung},
        title={{A Capsule Network-based Embedding Model for Knowledge Graph Completion and Search Personalization}},
        booktitle={Proceedings of the 2019 Annual Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (NAACL-HLT)},
        year={2019}
      }



Requirements:

  • Python 3
  • TensorFlow >= 1.6


To run the program:

    $ python --embedding_dim <int> --num_filters <int> --learning_rate <float> --name <dataset_name> [--useConstantInit] --model_name <name_of_saved_model>

Required parameters:

--embedding_dim: Dimensionality of entity and relation embeddings.

--num_filters: Number of filters.

--learning_rate: Initial learning rate.

--name: Dataset name (WN18RR or FB15k-237).

--useConstantInit: If set, initialize filters to [0.1, 0.1, -0.1]; otherwise, initialize filters from a truncated normal distribution.

--model_name: Name of saved models.

--num_epochs: Number of training epochs.

--vec_len_secondCaps: Number of neurons within the capsule in the second layer (Default: 10).

--run_folder: Specify directory path to save trained models.

--batch_size: Batch size.
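As a rough illustration of the command-line interface described above, the flags could be parsed with argparse along these lines. This is a hypothetical sketch: the flag names mirror the list above, but the types and defaults are assumptions, and the actual script may differ.

```python
import argparse

# Hypothetical sketch of the CLI described above; defaults are assumptions.
parser = argparse.ArgumentParser(description="CapsE options (sketch)")
parser.add_argument("--embedding_dim", type=int, default=100)
parser.add_argument("--num_filters", type=int, default=50)
parser.add_argument("--learning_rate", type=float, default=0.0001)
parser.add_argument("--name", default="FB15k-237")
parser.add_argument("--useConstantInit", action="store_true")
parser.add_argument("--model_name", default="capse")
parser.add_argument("--num_epochs", type=int, default=31)
parser.add_argument("--vec_len_secondCaps", type=int, default=10)
parser.add_argument("--run_folder", default="./")
parser.add_argument("--batch_size", type=int, default=128)

# Parse an empty argument list to show the assumed defaults.
args = parser.parse_args([])
```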

To train a CapsE model to reproduce the experimental results published in the paper:

    $ python --embedding_dim 100 --num_epochs 31 --num_filters 50 --learning_rate 0.0001 --name FB15k-237 --useConstantInit --savedEpochs 30 --model_name fb15k237
    $ python --embedding_dim 100 --num_epochs 31 --num_filters 400 --learning_rate 0.00001 --name WN18RR --savedEpochs 30 --model_name wn18rr

Evaluation metrics

The evaluation script provides ranking-based scores as evaluation metrics, including mean rank (MR), mean reciprocal rank (MRR), Hits@1, and Hits@10, under the "Filtered" setting protocol.
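For reference, these four metrics can be computed from the list of filtered 1-based ranks (one rank per test triple). The following is a generic sketch of the standard definitions, not the repository's own evaluation code:

```python
def ranking_metrics(ranks):
    """Compute filtered ranking metrics from a list of 1-based ranks.

    MR is the mean rank, MRR the mean reciprocal rank, and Hits@k the
    fraction of test triples whose correct entity is ranked within top k.
    """
    n = len(ranks)
    return {
        "MR": sum(ranks) / n,
        "MRR": sum(1.0 / r for r in ranks) / n,
        "Hits@1": sum(r <= 1 for r in ranks) / n,
        "Hits@10": sum(r <= 10 for r in ranks) / n,
    }
```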

Depending on the available memory resources, you should change --num_splits to a suitable value to get a faster process. To get the results (supposing num_splits = 8):

    $ python --embedding_dim 100 --num_filters 50 --name FB15k-237 --useConstantInit --model_index 30 --model_name fb15k237 --num_splits 8 --decode
    $ python --embedding_dim 100 --num_filters 400 --name WN18RR --model_index 30 --model_name wn18rr --num_splits 8 --decode
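The --num_splits option divides the test triples into chunks that are scored separately, trading memory for speed. A minimal sketch of such a partition (a hypothetical helper for illustration, not taken from the repository):

```python
def split_indices(n_triples, num_splits):
    """Partition range [0, n_triples) into num_splits contiguous,
    near-equal (start, end) chunks, as --num_splits might do."""
    base, rem = divmod(n_triples, num_splits)
    chunks, start = [], 0
    for i in range(num_splits):
        size = base + (1 if i < rem else 0)  # spread the remainder
        chunks.append((start, start + size))
        start += size
    return chunks
```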


By agreement, you MUST cite the paper "Search Personalization with Embeddings" whenever the SEARCH17 dataset is used to produce published results. Unzip the dataset archive into the data/SEARCH17 folder.

At the moment, I cannot release the text because of privacy issues.


Please cite the paper whenever CapsE is used to produce published results or incorporated into other software. As a free open-source implementation, CapsE is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. All other warranties including, but not limited to, merchantability and fitness for purpose, whether express, implied, or arising by operation of law, course of dealing, or trade usage are hereby disclaimed. I believe that the programs compute what I claim they compute, but I do not guarantee this. The programs may be poorly and inconsistently documented and may contain undocumented components, features or modifications. I make no guarantee that these programs will be suitable for any application.

CapsE is licensed under the Apache License 2.0.


I would like to thank Huadong Liao (naturomics) for the CapsNet implementation of Hinton's paper "Dynamic Routing Between Capsules".
