Easy Start

English | Simplified Chinese

Requirements

  • python == 3.8
  • torch == 1.5
  • transformers == 3.4.0
  • hydra-core == 1.0.6
  • deepke

Model

Few-shot relation extraction based on the WWW 2022 paper "KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization for Relation Extraction".

Download Code

git clone https://github.com/zjunlp/DeepKE.git
cd DeepKE/example/re/few-shot

Install with Pip

  • Create and enter the python virtual environment.
  • Install dependencies: pip install -r requirements.txt.

Train and Predict

  • Dataset

    • Download the dataset to this directory.

      wget 120.27.214.45/Data/re/few_shot/data.tar.gz
      tar -xzvf data.tar.gz
    • The dataset SEMEVAL is stored in data:

• rel2id.json: mapping from relation labels / answer words to IDs

      • test.txt: Test set

      • train.txt: Training set

• val.txt: Validation set

    • We also provide data augmentation methods to effectively leverage limited annotated RE data.
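Each line of train.txt/val.txt/test.txt is one annotated example. The sketch below shows how such a line could be parsed and mapped to a label id via rel2id.json; the field names (token, h, t, relation) and the sample values are assumptions based on common DeepKE RE formats, so verify them against the downloaded files.

```python
import json

# Hypothetical example, assuming a JSON-lines schema with "token" (word list),
# head entity "h", tail entity "t", and a "relation" string. Check the real
# train.txt before relying on these exact field names.
sample = {
    "token": ["the", "cup", "is", "on", "the", "table", "."],
    "h": {"name": "cup", "pos": [1, 2]},      # head entity span [start, end)
    "t": {"name": "table", "pos": [5, 6]},    # tail entity span [start, end)
    "relation": "Content-Container(e1,e2)",   # SEMEVAL-style relation label
}

# rel2id.json maps each relation label / answer word to an integer id;
# this toy mapping stands in for the real file.
rel2id = {"Content-Container(e1,e2)": 0, "Other": 1}

line = json.dumps(sample)            # one example is serialized as one line
example = json.loads(line)           # parse it back, as a data loader would
label_id = rel2id[example["relation"]]
print(example["h"]["name"], example["t"]["name"], label_id)
```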

  • Training

• Parameters, model paths, and training configuration live in the conf folder; modify them there before training.

    • Few-shot training on SEMEVAL

      python run.py
    • The trained model is stored in the current directory by default.

    • Resume training from a previously saved model

      Set train_from_saved_model in the .yaml config to the path of the last-trained checkpoint.

    • Training logs are written to the current directory by default; the path can be changed by setting log_dir in the .yaml config.
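The two settings mentioned above might look like this in the .yaml config. The key names train_from_saved_model and log_dir come from the instructions here, but the values and comments are illustrative assumptions, not the actual file contents:

```yaml
# Illustrative fragment only; values are placeholder assumptions.
train_from_saved_model: ""   # set to e.g. a saved checkpoint path to resume training
log_dir: ./logs              # directory where training logs are written
```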

  • Prediction

    python predict.py

Cite

If you use or extend our work, please cite the following paper:

@inproceedings{DBLP:conf/www/ChenZXDYTHSC22,
  author    = {Xiang Chen and
               Ningyu Zhang and
               Xin Xie and
               Shumin Deng and
               Yunzhi Yao and
               Chuanqi Tan and
               Fei Huang and
               Luo Si and
               Huajun Chen},
  editor    = {Fr{\'{e}}d{\'{e}}rique Laforest and
               Rapha{\"{e}}l Troncy and
               Elena Simperl and
               Deepak Agarwal and
               Aristides Gionis and
               Ivan Herman and
               Lionel M{\'{e}}dini},
  title     = {KnowPrompt: Knowledge-aware Prompt-tuning with Synergistic Optimization
               for Relation Extraction},
  booktitle = {{WWW} '22: The {ACM} Web Conference 2022, Virtual Event, Lyon, France,
               April 25 - 29, 2022},
  pages     = {2778--2788},
  publisher = {{ACM}},
  year      = {2022},
  url       = {https://doi.org/10.1145/3485447.3511998},
  doi       = {10.1145/3485447.3511998},
  timestamp = {Tue, 26 Apr 2022 16:02:09 +0200},
  biburl    = {https://dblp.org/rec/conf/www/ChenZXDYTHSC22.bib},
  bibsource = {dblp computer science bibliography, https://dblp.org}
}