
Few-shot Relational Reasoning via Connection Subgraph Pretraining (NeurIPS 2022)

Paper: https://arxiv.org/abs/2210.06722

We propose the Connection Subgraph Reasoner (CSR), which makes predictions for the few-shot relational reasoning task directly via self-supervised pretraining over knowledge graphs.

[Main figure: overview of CSR]

Specifically, we design a self-supervised pretraining scheme with the objective of reconstructing automatically sampled connection subgraphs.

[Figure: connection subgraph reconstruction]
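For intuition only, here is a tiny toy sketch of this kind of reconstruction objective: sample a small subgraph around an entity pair, hide part of it, and train a model to reconstruct what was hidden. It is not the CSR model implemented in this repo (see model.py for that), and every identifier below is an illustrative placeholder.

import random
import torch
import torch.nn as nn
import torch.nn.functional as F

def sample_connection_subgraph(triples, head, tail, max_edges=8):
    # Toy sampler: keep triples incident to the head or tail entity.
    touching = [t for t in triples if head in (t[0], t[2]) or tail in (t[0], t[2])]
    random.shuffle(touching)
    return touching[:max_edges]

class SubgraphReconstructor(nn.Module):
    # Tiny stand-in for a subgraph encoder plus a reconstruction head.
    def __init__(self, num_relations, dim=32):
        super().__init__()
        self.mask_id = num_relations                    # extra "masked" token id
        self.rel_emb = nn.Embedding(num_relations + 1, dim)
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.decoder = nn.Linear(dim, num_relations)

    def forward(self, relation_ids):
        x = self.rel_emb(relation_ids).unsqueeze(0)     # (1, num_edges, dim)
        out, _ = self.encoder(x)                        # per-edge encodings
        return self.decoder(out.squeeze(0))             # (num_edges, num_relations)

triples = [(0, 0, 1), (1, 1, 2), (0, 2, 2), (2, 0, 3)]  # (head, relation, tail) ids
model = SubgraphReconstructor(num_relations=3)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

sub = sample_connection_subgraph(triples, head=0, tail=2)
targets = torch.tensor([rel for _, rel, _ in sub])
masked = targets.clone()
masked[0] = model.mask_id                               # hide one edge's relation
loss = F.cross_entropy(model(masked), targets)          # reconstruct the sampled subgraph
loss.backward()
optimizer.step()
print("toy reconstruction loss:", loss.item())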

Requirements

To install requirements:

pip install -r requirements.txt

Download the NELL, FB15K-237, and ConceptNet data, including both the raw triplets and the preprocessed data, as well as the embeddings for all datasets. Alternatively, download everything from Google Drive. After downloading, unzip the archives at the top level of this repo.

To replicate the preprocessing of the data from the raw triplets:

  1. Extract subgraphs using python graph_extractions/graph_sampler.py.
  2. Preprocess each dataset by constructing SubgraphFewshotDataset from load_kg_dataset.py with preprocess / preprocess_50negs set to True.

See more detailed configurations and examples inside graph_extractions/graph_sampler.py.
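For step 2, a minimal sketch is shown below. Only the preprocess / preprocess_50negs flags come from the description above; the other constructor arguments (root, dataset name) are assumptions, so check load_kg_dataset.py for the actual signature before running it.

# Hypothetical sketch of step 2. Argument names other than preprocess /
# preprocess_50negs are assumptions; see load_kg_dataset.py for the real API.
from load_kg_dataset import SubgraphFewshotDataset

for dataset in ["NELL", "FB15K-237", "ConceptNet"]:
    SubgraphFewshotDataset(
        root=".",                # assumed: data unzipped at the top level of this repo
        dataset=dataset,
        preprocess=True,         # build the preprocessed files from the raw triplets
        preprocess_50negs=True,  # also build the 50-negative evaluation candidates
    )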

Training

The main files are model.py, which contains our models, and trainer.py, which contains our training code.

Usage:

To train CSR-GNN on NELL in the transductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-NELL-GNN --dataset NELL --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 0.7 --coefficient2 0.1 --use_pretrain_node_emb True

To train CSR-GNN on NELL in the inductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-NELL-inductive-GNN --dataset NELL --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 0.7 --coefficient2 0.1 --inductive True

Commands for Other Datasets

To train CSR-GNN on FB15K-237 in the transductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-FB-GNN --dataset FB15K-237 --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 0.1 --coefficient2 1 --use_pretrain_node_emb True

To train CSR-GNN on FB15K-237 in the inductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-FB-inductive-GNN --dataset FB15K-237 --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 2 --coefficient2 2 --inductive True 

To train CSR-GNN on ConceptNet in the transductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-CN-GNN --dataset ConceptNet --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 1 --coefficient2 0.5 --use_pretrain_node_emb True --embed_model ComplEx

To train CSR-GNN on ConceptNet in the inductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-CN-inductive-GNN --dataset ConceptNet --step pretrain --learning_rate 1e-5 --use_atten True  --coefficient 2 --coefficient2 0.5 --inductive True --embed_model ComplEx
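The transductive commands above differ only in the prefix, the coefficient / coefficient2 values, and (for ConceptNet) the embed_model flag. As an optional convenience, the sketch below launches all three from Python; every flag value is copied from the commands above, and the wandb run name ("my_run") is a placeholder to replace with your own.

import subprocess

# Transductive hyperparameters, copied from the commands in this README.
RUNS = {
    "NELL":       {"prefix": "CSR-NELL-GNN", "coefficient": "0.7", "coefficient2": "0.1", "embed_model": None},
    "FB15K-237":  {"prefix": "CSR-FB-GNN",   "coefficient": "0.1", "coefficient2": "1",   "embed_model": None},
    "ConceptNet": {"prefix": "CSR-CN-GNN",   "coefficient": "1",   "coefficient2": "0.5", "embed_model": "ComplEx"},
}

for dataset, cfg in RUNS.items():
    cmd = [
        "python", "main.py",
        "--device", "0",
        "--wandb_name", "my_run",        # placeholder: use your own wandb run name
        "--prefix", cfg["prefix"],
        "--dataset", dataset,
        "--step", "pretrain",
        "--learning_rate", "1e-5",
        "--use_atten", "True",
        "--coefficient", cfg["coefficient"],
        "--coefficient2", cfg["coefficient2"],
        "--use_pretrain_node_emb", "True",
    ]
    if cfg["embed_model"]:
        cmd += ["--embed_model", cfg["embed_model"]]
    subprocess.run(cmd, check=True)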

Evaluation and Pre-trained Models

Pretrained models can be downloaded here. Below are example commands for evaluating on NELL with the NELL pretrained models:

CSR-GNN on NELL in the transductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-NELL-GNN --dataset NELL --step test --use_atten True --use_pretrain_node_emb True --prev_state_dir checkpoints/CSR-NELL-GNN.pt

CSR-GNN on NELL in the inductive setting:

python main.py --device 0 --wandb_name <wandb_name> --prefix CSR-NELL-GNN --dataset NELL --step test --use_atten True --inductive True --prev_state_dir checkpoints/CSR-NELL-inductive-GNN.pt

CSR-OPT

CSR-OPT on NELL in the inductive setting, with hyperparameter tuning:

python main.py --device 0 --wandb_name <wandb_name>  --prefix CSR-NELL-inductive-OPT --dataset NELL --step tune

CSR-OPT on NELL in the inductive setting, with pre-selected hyperparameters:

python main.py --device 0 --wandb_name <wandb_name>  --prefix CSR-NELL-inductive-OPT --dataset NELL --step opt_test

Results

Our models achieve the following performance:

Dataset      Model     Transductive MRR   Inductive MRR
NELL         CSR-OPT   0.463              0.425
NELL         CSR-GNN   0.577              0.511
FB15K-237    CSR-OPT   0.619              0.554
FB15K-237    CSR-GNN   0.781              0.624
ConceptNet   CSR-OPT   0.559              0.547
ConceptNet   CSR-GNN   0.606              0.611

See full results in our paper.

Citations

If you use this repo, please cite the following paper. This repo is mainly based on the MetaR repo, and the code in subgraph_extraction/ is largely borrowed from Grail.

@inproceedings{csr2022,
  title={Few-shot Relational Reasoning via Connection Subgraph Pretraining},
  author={Qian Huang and Hongyu Ren and Jure Leskovec},
  booktitle={Neural Information Processing Systems},
  year={2022}
}
