
CasRel-pytorch-reimplement

PyTorch reimplementation of the ACL 2020 paper "A Novel Cascade Binary Tagging Framework for Relational Triple Extraction". The original code was written in Keras.

Requirements

  • keras-bert
  • tensorflow-gpu
  • transformers
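
The dependencies can typically be installed from PyPI; exact versions are not pinned in this repository, so treat the command below as a starting point (torch is also needed, since this is the PyTorch reimplementation):

    pip install torch transformers keras-bert tensorflow-gpu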

Dataset

  • CMED: CHIP-2020 Chinese medical text entity and relation extraction

Usage

  1. Get the pre-trained Chinese BERT model

    1. Download the vocab.txt of BERT-wwm
    2. Get the pre-trained BERT cache:

       from transformers import BertModel
       # Downloading the model once stores its weights in the local transformers cache
       model = BertModel.from_pretrained("hfl/chinese-bert-wwm")

    P.S. chinese-bert-wwm is used here; other pre-trained models can be loaded in the same way. A short sketch that checks the cached model loads correctly is given after this list.

    P.P.S. The BERT cache is usually located at /home/<your username>/.cache/torch/transformers

  2. Train the model

    python train.py
    
  3. Test the model

    python test.py
    
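As a quick check that the pre-trained model from step 1 was cached correctly, the minimal sketch below loads the tokenizer and model and runs one sample sentence through BERT. The sentence and the printed shape are only for illustration, and the exact output type may differ slightly between transformers versions:

    from transformers import BertModel, BertTokenizer

    # Both calls read from (or populate) the local transformers cache
    tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")
    model = BertModel.from_pretrained("hfl/chinese-bert-wwm")

    # Encode a short Chinese sentence and run it through BERT
    inputs = tokenizer.encode_plus("患者出现头痛症状", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs[0].shape)  # (1, sequence_length, 768) for a BERT-base model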

Results

CasRel-keras: test F1 45.59

CasRel-pytorch: test F1 47.59

About

Joint extraction of entities and relations with a cascaded pointer-tagging approach.
