neural-entity-context-models

This repository accompanies the paper Neural Entity Context Models, accepted at IJCKG 2023.

Abstract

A prevalent approach in entity-oriented systems is to retrieve relevant entities by harnessing knowledge graph embeddings. These embeddings encode entity information in the context of the knowledge graph and are static in nature. Our goal is to generate entity embeddings that capture what renders an entity relevant for the query. This differs from entity embeddings constructed from static resources, for example, E-BERT. Previously, Dalton [13] demonstrated the benefits of the Entity Context Model, a pseudo-relevance feedback approach based on entity links in relevant contexts. In this work, we reinvent the Entity Context Model (ECM) for graph neural networks and incorporate pre-trained embeddings. We introduce three entity ranking models based on the fundamental principles of ECM: (1) Graph Attention Networks, (2) Simple Graph Relevance Networks, and (3) Graph Relevance Networks. Graph Attention Networks and Graph Relevance Networks are the graph neural variants of ECM that employ the attention mechanism and the relevance information of the relevant context, respectively, to ascertain entity relevance. One of our main contributions is improving on the attention mechanism by using relevance information. Our experiments demonstrate that our neural variants of the ECM model significantly outperform the state-of-the-art BERT-ER by more than 14% and exceed the performance of systems that use knowledge graph embeddings. Notably, our findings reveal that leveraging the relevance of the relevant context is more effective at identifying relevant entities than the attention mechanism. To evaluate the efficacy of the models, we conduct experiments on two standard benchmark datasets, DBpediaV2 and TREC Complex Answer Retrieval. To aid reproducibility, our code and data are available.

Code contains the code for the Neural ECM models.

Runs contains the run files generated by the ECM models.

Graph Relevance Network

To train the GRN model, please run the following command:

python train.py \
    --train $data/$mode/train.pairwise.jsonl \
    --save-dir $data/$mode \
    --dev $data/$mode/test.jsonl \
    --save GRN.bin \
    --run $data/$mode/test.run \
    --query-in-emb 50 --ent-in-emb 100 --para-in-emb 50 \
    --model-type pairwise \
    --use-cuda --cuda $cuda \
    --epoch 50 --batch-size 1000 --seed 91453 \
    --layer-flag 1
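The command above assumes three shell variables that are not defined in this repository. A minimal setup sketch, with hypothetical example values (the path and mode name below are illustrations, not values shipped with the repo):

data=/path/to/data   # root directory holding the prepared JSONL files (hypothetical path)
mode=dbpedia         # dataset subdirectory under $data containing train/test files (hypothetical value)
cuda=0               # GPU device id forwarded to --cuda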

Graph Attention Network

To train the GAT model, please run the following command:

python train.py \
    --train $data/$mode/train.pairwise.jsonl \
    --save-dir $data/$mode \
    --dev $data/$mode/test.jsonl \
    --save GAT.bin \
    --run $data/$mode/test.run \
    --query-in-emb 50 --ent-in-emb 100 --para-in-emb 50 \
    --model-type pairwise \
    --use-cuda --cuda $cuda \
    --epoch 50 --batch-size 1000 --seed 91453 \
    --layer-flag 2
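Since the GRN and GAT commands differ only in the checkpoint name and --layer-flag, both can be trained in one loop. A sketch reusing the hypothetical variables above; it renames the --run output per model so the two runs do not overwrite each other (the commands in this README write both to test.run):

for model in GRN:1 GAT:2; do
    name=${model%:*}; flag=${model#*:}   # split "NAME:FLAG" into its two parts
    python train.py \
        --train $data/$mode/train.pairwise.jsonl --save-dir $data/$mode \
        --dev $data/$mode/test.jsonl --save $name.bin --run $data/$mode/$name.test.run \
        --query-in-emb 50 --ent-in-emb 100 --para-in-emb 50 --model-type pairwise \
        --use-cuda --cuda $cuda --epoch 50 --batch-size 1000 --seed 91453 --layer-flag $flag
done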

Special-GRN

To train the Special-GRN model, please run the following command:

python train.py \
    --train $data/$mode/train.pairwise.jsonl \
    --save-dir $data/$mode \
    --dev $data/$mode/test.jsonl \
    --save GRN-ECM.bin \
    --run $data/$mode/test.run \
    --query-in-emb 1 --ent-in-emb 1 --para-in-emb 1 \
    --model-type pairwise \
    --use-cuda --cuda $cuda \
    --epoch 50 --batch-size 1000 --seed 91453 \
    --layer-flag 3
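The --run flag writes a TREC-format run file, so any standard TREC evaluation tool can score it. For example, with trec_eval (not part of this repository) and a matching qrels file; the qrels path below is a placeholder for whichever relevance judgments you use:

trec_eval -m map -m ndcg_cut.10 $data/$mode/test.qrels $data/$mode/test.run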
