This repo contains the code implementing the models described in the paper *Neural Methods for Logical Reasoning over Knowledge Graphs* (ICLR 2022).
In this paper, we focus on answering multi-hop logical queries over Knowledge Graphs (KGs). To this end, we have implemented the models listed below, alongside the original baseline models.
Baselines
- The original baseline models from the codebase this repo is built on (see Acknowledgements)
Models
- MLP: Multi-Layer Perceptron (see the operator sketch below the Variants list)
- MLPMixer: adapted from the MLP-Mixer architecture (Tolstikhin et al., 2021)
Variants
- MLP + Heterogeneous Hyper-Graph Embeddings
- MLP + Attention Mechanism
- MLP + 2 Vector Average
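To make the general idea concrete, here is a minimal, hypothetical PyTorch sketch of how MLP-based query operators can be wired together: the logical operators (relation projection, intersection) are parameterized as small MLPs acting on fixed-size query/entity embeddings, and answers are ranked by comparing the final query embedding against the entity embeddings. All class names, dimensions, and the mean aggregation are illustrative assumptions, not the repo's actual code; see the implementation in this repo for the exact architectures.

```python
import torch
import torch.nn as nn


class MLPProjection(nn.Module):
    """Relation projection: maps a query embedding and a relation embedding
    to the embedding of the projected query."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, query_emb: torch.Tensor, rel_emb: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([query_emb, rel_emb], dim=-1))


class MLPIntersection(nn.Module):
    """Intersection: aggregates the embeddings of several sub-queries
    into a single query embedding."""

    def __init__(self, dim: int, hidden: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, dim),
        )

    def forward(self, sub_query_embs: torch.Tensor) -> torch.Tensor:
        # sub_query_embs: (num_subqueries, batch, dim).
        # A simple permutation-invariant aggregation (mean) followed by an MLP;
        # the variants above swap this aggregation step (e.g. attention, 2-vector average).
        return self.net(sub_query_embs.mean(dim=0))


# Toy usage: embed a 2-hop branch and a 1-hop branch, intersect them,
# then rank candidate entities by similarity to the final query embedding.
dim, hidden, batch = 400, 512, 4
proj = MLPProjection(dim, hidden)
inter = MLPIntersection(dim, hidden)

anchor = torch.randn(batch, dim)                            # anchor entity embeddings
r1, r2, r3 = (torch.randn(batch, dim) for _ in range(3))    # relation embeddings

branch_a = proj(proj(anchor, r1), r2)                       # 2-hop projection
branch_b = proj(anchor, r3)                                 # 1-hop projection
query = inter(torch.stack([branch_a, branch_b]))

entity_table = torch.randn(1000, dim)                       # all candidate entity embeddings
scores = query @ entity_table.t()                           # (batch, num_entities) ranking scores
```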
How to use it
Examples of how to execute the code can be found in `examples.sh`.
Data
To evaluate the models, we use the standard evaluation datasets (FB15k, FB15k-237, NELL995) from the BetaE paper. They can be downloaded here.
Citations
If you use this repo, please cite the following paper.
@inproceedings{amayuelas2022neural,
  title={Neural Methods for Logical Reasoning over Knowledge Graphs},
  author={Amayuelas, Alfonso and Zhang, Shuai and Rao, Xi Susie and Zhang, Ce},
  booktitle={International Conference on Learning Representations},
  year={2022}
}
Acknowledgements
This code is built on top of previous work from SNAP-Stanford. Check out their repo here.