This repository provides a PyTorch implementation of RAKGE, as described in the paper "Exploiting Relation-aware Attribute Representation Learning in Knowledge Graph Embedding for Numerical Reasoning" (KDD 2023).
If you use RAKGE in your research, please cite the following paper:
```bibtex
@inproceedings{RAKGE,
  author    = {Kim, Gayeong and Kim, Sookyung and Kim, Ko Keun and Park, Suchan and Jung, Heesoo and Park, Hogun},
  title     = {Exploiting Relation-aware Attribute Representation Learning in Knowledge Graph Embedding for Numerical Reasoning},
  booktitle = {Proceedings of the 29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining},
  year      = {2023}
}
```
Dependencies:
- python 3.7+
- torch 1.9+
- dgl 0.7+
Download the datasets from the following link: https://drive.google.com/drive/folders/1ecUQvVTSDqUdY3_PVDeNUvg5LDaTwxZ7?usp=sharing
and save them in the `dataset/` directory.
To preprocess the numerical attributes, run:
```shell
python preprocess_kg_num_lit.py --dataset {credit, spotify, US-cities}
```
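Conceptually, this step collects each entity's numeric attribute triples into an entity-by-attribute matrix and normalizes each attribute, as in LiteralE-style pipelines. The sketch below is only illustrative (the function name, min-max scheme, and zero-filling of missing values are assumptions, not necessarily what `preprocess_kg_num_lit.py` does):

```python
import numpy as np

def build_numeric_literal_matrix(triples, entity2id, attr2id):
    """Illustrative sketch: build a (num_entities x num_attributes) matrix
    of numeric literals, min-max normalized per attribute column.
    Missing values are left as 0 here; the actual script may differ."""
    mat = np.zeros((len(entity2id), len(attr2id)), dtype=np.float32)
    for ent, attr, value in triples:
        mat[entity2id[ent], attr2id[attr]] = float(value)
    # Min-max normalize each attribute column to [0, 1].
    col_min = mat.min(axis=0)
    col_rng = mat.max(axis=0) - col_min
    col_rng[col_rng == 0] = 1.0  # avoid division by zero for constant columns
    return (mat - col_min) / col_rng
```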
Now you are ready to train and evaluate RAKGE and the other baselines. To reproduce the results reported in the paper, run the command for the corresponding model:

RAKGE:
```shell
python run.py --gpu 0 --n_layer 0 --literal --init_dim 200 --att_dim 200 --head_num 5 --name RAKGE --scale 0.25 --order 0.25 --data {credit, spotify, US-cities} --drop 0.7
```

LTE (TransE):
```shell
python run.py --gpu 0 --n_layer 0 --init_dim 200 --name lte --score_func transe --opn mult --x_ops "d" --hid_drop 0.7 --data {credit, spotify, US-cities}
```

TransE + LiteralE (gated):
```shell
python run.py --gpu 0 --n_layer 0 --literal --init_dim 200 --name TransELiteral_gate --data {credit, spotify, US-cities} --input_drop 0.7
```

RGCN (TransE):
```shell
python run.py --gpu 0 --n_layer 1 --score_func transe --opn mult --gcn_dim 150 --init_dim 150 --num_base 5 --encoder rgcn --name repro --data {credit, spotify, US-cities} --hid_drop 0.7
```
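The RAKGE flags `--head_num 5` and `--att_dim 200` configure multi-head attention in which relations attend over attribute representations. As a rough conceptual sketch only (single head, NumPy instead of PyTorch, and not the paper's exact formulation), relation-aware aggregation of attributes looks like:

```python
import numpy as np

def relation_aware_attention(rel_emb, attr_embs, attr_vals):
    """Illustrative single-head attention: the relation embedding acts as the
    query over attribute embeddings (keys), and the attended, value-scaled
    attribute embeddings form a relation-aware numeric summary.
    RAKGE's actual model is multi-head and trained end to end."""
    # Scaled dot-product scores between the relation and each attribute.
    scores = attr_embs @ rel_emb / np.sqrt(rel_emb.shape[0])
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    # Weight each attribute embedding by its numeric value, then aggregate.
    return weights @ (attr_vals[:, None] * attr_embs)
```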
Please send any questions about the code or the algorithm to gayeongkim@o365.skku.edu.
Our implementation builds on the code of LTE-KGC and LiteralE; we thank the authors for their contributions.