LCGE

AAAI 2023: Logic and Commonsense-Guided Temporal Knowledge Graph Completion

Introduction

This is the PyTorch implementation of the LCGE framework. We propose a Logic and Commonsense-Guided Embedding model (LCGE) that jointly learns a time-sensitive representation of events, capturing their timeliness and causality, together with a time-independent representation of events from the perspective of commonsense. Specifically, we design a temporal rule learning algorithm to construct a rule-guided predicate embedding regularization strategy for learning the causality among events. Furthermore, the auxiliary commonsense knowledge allows the model to evaluate the plausibility of events more accurately.
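As a very rough illustration of how the two views can be combined (a schematic sketch only, not the paper's actual scoring function; the function name and the additive weighting below are assumptions, loosely motivated by the --weight_static hyperparameter used in the training commands):

# Schematic sketch only: the plausibility of a temporal fact (s, r, o, t) is viewed as a
# weighted combination of a time-sensitive score and a time-independent commonsense score.
# The additive form and all names here are assumptions, not the paper's exact model.
def lcge_plausibility(score_temporal, score_commonsense, weight_static=0.1):
    return score_temporal + weight_static * score_commonsense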

An Overview of the LCGE Framework

(Figure: overview of the LCGE framework.)

Appendix

We provide the Appendix of our original paper published at AAAI 2023, containing the Description of the Temporal Rule Patterns, the Evaluation Criteria of Length-2 Temporal Rules, the Algorithm of Our Proposed Temporal Rule Learning Module, and the Proof of Representing Causality Among Events via the RGPR Mechanism.

Installation

Create a conda environment with PyTorch and scikit-learn:

conda create --name lcge_env python=3.7
source activate lcge_env
conda install --file requirements.txt -c pytorch

Then install the lcge package into this environment:

python setup.py install
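To confirm that the core dependencies are available in the environment, a quick check such as the following can be run:

python -c "import torch, sklearn; print('torch', torch.__version__, '| scikit-learn', sklearn.__version__)"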

Datasets

Process the datasets and add them to the package data folder by running:

cd lcge
python process_icews.py
python process_wikidata12k.py

The global static KG can be constructed by running staticgraph_icews.ipynb in the ./src_data/static_data folder.
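The core idea is to collapse the temporal quadruples into a static KG; a minimal sketch under that assumption follows (the input path and the tab-separated four-column layout of the raw ICEWS files are assumptions; see the notebook for the actual steps):

# Minimal sketch of building a global static KG from temporal quadruples:
# drop the timestamps and keep each distinct (subject, relation, object) triple once.
triples = set()
with open("src_data/ICEWS14/train", "r") as f:              # hypothetical input path
    for line in f:
        s, r, o, _t = line.strip().split("\t")
        triples.add((s, r, o))
with open("src_data/static_data/triples.tsv", "w") as f:    # cf. triples.tsv below
    for s, r, o in sorted(triples):
        f.write("\t".join((s, r, o)) + "\n")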

To generate the static rules, you can use the AMIE+ tool (amie_plus.jar) in the ./src_data/rulelearning folder.
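A typical AMIE+ invocation over the generated static KG looks like the following (the output file name is only an example; see the AMIE+ documentation for additional options):

java -jar amie_plus.jar triples.tsv > static_rules.txt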

The temporal rules can be generated by running temporal_rule_learning_icews14.ipynb and temporal_rule_learning_icews15.ipynb in the ./src_data/rulelearning folder.

triples.tsv: the generated global static KG.
rule1_p1.json and rule1_p2.json: the length-1 temporal rules.
rule2_p1.txt, rule2_p2.txt, rule2_p3.txt and rule2_p4.txt: the length-2 temporal rules.

Train and Test

To reproduce the results of the LCGE model on the datasets, run the following commands:
ICEWS14:
ICEWS14:

python learner_lcge.py --dataset ICEWS14 --model LCGE --rank 2000 --emb_reg 0.005 --time_reg 0.01 --rule_reg 0.01 --max_epoch 1000 --weight_static 0.1 --learning_rate 0.1

ICEWS05-15:

python learner_lcge.py --dataset ICEWS05-15 --model LCGE --rank 2000 --emb_reg 0.0025 --time_reg 0.05 --rule_reg 1.0 --max_epoch 1000 --weight_static 0.1 --learning_rate 0.1

Wikidata12k:

python learner_cs.py --dataset wikidata12k --model LCGE --rank 2000 --emb_reg 0.2 --time_reg 0.5 --max_epoch 500 --weight_static 0.1 --learning_rate 0.1

Notes

We discovered some issues in a previous version of the code. We apologize for any inconvenience caused by these errors; we have rerun the experiments with the corrected code and updated the results in the current version of our paper. The updated experimental results indicate that our proposed model remains effective and outperforms all baseline approaches.

Citation

If you use this code, please cite the following paper:

@inproceedings{niu2023lcge,
  author    = {Guanglin Niu and
               Bo Li},
  title     = {Logic and Commonsense-Guided Temporal Knowledge Graph Completion},
  booktitle = {AAAI},
  year      = {2023}
}
