RSMAN

Source code for NAACL 2022 paper: Relation-Specific Attentions over Entity Mentions for Enhanced Document-Level Relation Extraction

Introduction

This is the implementation of RSMAN (Relation-Specific Mention Attention Network) with SSAN as the backbone model. RSMAN can easily be plugged into different backbone models to enhance them; here we take SSAN as an example. Part of the code is adapted from https://github.com/BenfengXu/SSAN, and we gratefully acknowledge it.

Requirements

  • python==3.6
  • pytorch==1.4.0
  • transformers==2.7.0
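
For convenience, the pinned packages above can also be kept in a `requirements.txt` (a suggestion, not a file shipped with the repo; install PyTorch 1.4.0 per the official instructions for your CUDA version):

```
torch==1.4.0
transformers==2.7.0
```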

Dataset

  • DocRED
  • DWIE
  • Note that you should process DWIE into the same format as DocRED. Put the datasets into the directory ./data.
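
Each document in the DocRED format is a JSON object with `title`, `sents` (tokenized sentences), `vertexSet` (mentions grouped by entity), and `labels` (relation instances). A minimal sketch for sanity-checking a converted DWIE file against that shape (the function name and file path are hypothetical, not part of this repo):

```python
import json

REQUIRED_KEYS = {"title", "sents", "vertexSet", "labels"}

def check_docred_format(path):
    """Check that every document in a DocRED-style JSON file has the expected fields."""
    with open(path) as f:
        docs = json.load(f)
    for i, doc in enumerate(docs):
        missing = REQUIRED_KEYS - doc.keys()
        if missing:
            raise ValueError(f"doc {i} is missing fields: {sorted(missing)}")
        # each mention records at least its surface name, sentence id, and token span
        for entity in doc["vertexSet"]:
            for mention in entity:
                assert {"name", "sent_id", "pos"} <= mention.keys()
    return len(docs)
```

Running this over the converted DWIE files before training can catch format mismatches early.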

Train

Download pre-trained language models into the directory ./pretrained_lm and run:

python run.py

Evaluation on the dev set runs during training at each logging step, and the checkpoint with the best dev result is saved to the directory ./checkpoints.

Test

To get results on the test set, run:

python run.py --do_train False --do_predict

A test result file in the official evaluation format will then be saved as ./checkpoints/result.json. Compress it and submit it to CodaLab to get the final test score.
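
CodaLab submissions are typically zip archives with the prediction file at the archive root; a minimal packaging sketch (the placeholder result.json is created here only so the commands run end-to-end; in practice the real file comes from run.py):

```shell
# Sketch: package the prediction file for a CodaLab submission.
mkdir -p checkpoints
echo '[]' > checkpoints/result.json   # placeholder; run.py produces the real file
# zip from inside the directory so result.json sits at the archive root
(cd checkpoints && python3 -m zipfile -c submission.zip result.json)
```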

We also provide our trained model and test result file; you can download them from here.
