
LP-BERT: Multi-task Pre-training Knowledge Graph BERT for Link Prediction

This repository provides the evaluation code of LP-BERT for the link prediction task. The idea of LP-BERT is described in the following articles:

Multi-task Pre-training Language Model for Semantic Network Completion (https://dl.acm.org/doi/abs/10.1145/3627704)
LP-BERT: Multi-task Pre-training Knowledge Graph BERT for Link Prediction (https://arxiv.org/pdf/xx.pdf)

With only BERT-base-scale parameters, LP-BERT achieves top-1 performance on both the WN18RR and UMLS datasets.
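
LP-BERT follows the line of work that encodes a triple (head, relation, tail) as a text sequence with BERT. As a rough illustration of that triple-as-text idea (not the repository's actual code: the checkpoint name, input serialization, and binary scoring head below are all assumptions), a minimal sketch in Python:

# Illustrative sketch only: score a (head, relation, tail) triple with a
# BERT sequence classifier. LP-BERT's real training objectives and input
# serialization may differ.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # label 1 = "plausible triple"
model.eval()

def score_triple(head: str, relation: str, tail: str) -> float:
    """Return a plausibility score in [0, 1] for the given triple."""
    # BERT natively supports two segments, so the relation and tail
    # share the second segment, separated by an extra [SEP] token.
    inputs = tokenizer(head, f"{relation} {tokenizer.sep_token} {tail}",
                       return_tensors="pt", truncation=True, max_length=128)
    with torch.no_grad():
        logits = model(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()

print(score_triple("hypertension", "is_a", "cardiovascular disease"))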

Environment

The code is implemented with PyTorch. Requirements (an example environment setup follows the list):
 1. python=3.7
 2. pytorch=1.9.0+cu102
 3. transformers=4.2.1
 4. numpy=1.17.2
 5. pandas=0.25.1
 6. scikit-learn=0.21.3
 7. tqdm=4.52.0
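
For example, a matching environment can be created with conda and pip (the wheel index below is the standard PyTorch one for CUDA builds; adjust the CUDA suffix to your hardware):

conda create -n lpbert python=3.7
conda activate lpbert
pip install torch==1.9.0+cu102 -f https://download.pytorch.org/whl/torch_stable.html
pip install transformers==4.2.1 numpy==1.17.2 pandas==0.25.1 scikit-learn==0.21.3 tqdm==4.52.0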

Reproduction of performance on WN18RR, UMLS, and FB15k-237

Prepare the data:

python make_concat_data.py
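
This script builds the concatenated text samples used for pre-training. Conceptually, it joins the textual descriptions of head, relation, and tail into one sequence per triple; the hypothetical sketch below assumes tab-separated description files and may not match the script's actual file names or layout:

import csv

def load_descriptions(path):
    """Map an entity/relation id to its textual description."""
    with open(path, encoding="utf-8") as f:
        return {row[0]: row[1] for row in csv.reader(f, delimiter="\t")}

entity_desc = load_descriptions("data/entity2text.txt")      # assumed file
relation_desc = load_descriptions("data/relation2text.txt")  # assumed file

with open("data/train.tsv", encoding="utf-8") as fin, \
     open("data/train_concat.txt", "w", encoding="utf-8") as fout:
    for head, relation, tail in csv.reader(fin, delimiter="\t"):
        # One concatenated text sample per (head, relation, tail) triple.
        fout.write(f"{entity_desc[head]} [SEP] {relation_desc[relation]} "
                   f"[SEP] {entity_desc[tail]}\n")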

UMLS:

CUDA_VISIBLE_DEVICES=0,1,2,3 python run_umls_pretrain.py
CUDA_VISIBLE_DEVICES=0 python run_umls_finetune.py

WN18RR:

CUDA_VISIBLE_DEVICES=0,1,2 python run_wn18rr_pretrain.py
CUDA_VISIBLE_DEVICES=0 python run_wn18rr_finetune.py

FB15k-237:

CUDA_VISIBLE_DEVICES=0,1,2 python run_fb15k237_pretrain.py
CUDA_VISIBLE_DEVICES=0 python run_fb15k237_finetune.py
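
Note that the pre-training scripts above are launched on several GPUs via CUDA_VISIBLE_DEVICES, while fine-tuning runs on a single GPU; edit the device lists to match your hardware.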
