DocED

This repository is the official implementation of the ACL 2021 paper MLBiNet: A Cross-Sentence Collective Event Detection Network.

Requirements

To install basic requirements:

pip install -r requirements.txt

Datasets

The ACE2005 dataset is available from the LDC: https://catalog.ldc.upenn.edu/LDC2006T06

Basic training

To evaluate a setting over several random trials, execute

python run_experiments_multi.py
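
As a rough illustration of what a multi-trial driver does, the sketch below launches the training script several times with different random seeds so results can be averaged. It is not the actual contents of run_experiments_multi.py; the `--seed` flag and the trial count are assumptions made for the example.

```python
# Illustrative sketch only -- the real run_experiments_multi.py may differ.
# Run the training script several times so results can be averaged over
# random trials. The --seed flag used here is hypothetical.
import subprocess

NUM_TRIALS = 5  # number of random trials to run

for trial in range(NUM_TRIALS):
    print(f"Starting trial {trial + 1}/{NUM_TRIALS}")
    subprocess.run(
        ["python", "train_MLBiNet.py", "--seed", str(trial)],
        check=True,  # abort if a trial fails
    )
```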

The main hyperparameters in train_MLBiNet.py include the following (an example invocation is given after the list):

--tagging_mechanism, mechanism for modeling event inter-dependency; choose one of "forward_decoder", "backward_decoder", or "bidirectional_decoder"

--num_tag_layers, number of tagging layers; 1 performs sentence-level event detection only, 2 additionally aggregates information from adjacent sentences, and so on

--max_doc_len, maximum number of consecutive sentences extracted as a mini-document; typical values are 8 or 16

--tag_dim, dimension of a uni-directional event tagging vector

--self_att_not, whether to apply the self-attention mechanism in the sentence encoder
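
For reference, a training run might be launched as follows. The flag names are the ones listed above, but the specific values (bidirectional decoder, 2 tagging layers, mini-documents of 8 sentences, tag dimension 100, self-attention enabled) are illustrative choices rather than settings prescribed by the authors, and the exact value format expected by --self_att_not is an assumption.

```
python train_MLBiNet.py \
    --tagging_mechanism bidirectional_decoder \
    --num_tag_layers 2 \
    --max_doc_len 8 \
    --tag_dim 100 \
    --self_att_not True
```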

Main results

Overall performance on ACE2005

(see the corresponding table in the paper)

Performance on detecting multiple events collectively

(see the corresponding table in the paper)

where 1/1 denotes sentences containing exactly one event and 1/n denotes sentences containing multiple events.

Performance of our proposed method with different multi-layer settings or decoder methods

(see the corresponding table in the paper)

How to Cite

@inproceedings{ACL2021_MLBiNet,
  author    = {Dongfang Lou and
               Zhilin Liao and
               Shumin Deng and
               Ningyu Zhang and
               Huajun Chen},
  title     = {MLBiNet: A Cross-Sentence Collective Event Detection Network},
  booktitle = {{ACL}},
  publisher = {Association for Computational Linguistics},
  year      = {2021}
}
