
GATE: Graph Attention Transformer Encoder

Official implementation of our AAAI 2021 paper on Cross-lingual Relation and Event Extraction. [arxiv]


  • We perform evaluation using the ACE 2005 dataset (three languages: English, Arabic, and Chinese).
  • We perform zero-shot relation extraction and event-argument role labeling.
  • We consider both single-source and multi-source transfer settings.
  • We implement three baseline methods for comparison, since their official implementations are not publicly available.


To train and test a specific model, go to the scripts folder and run the bash files under the model's directory. For example, to train and test our GATE model, do the following:

$ cd scripts/gate
$ bash gpu_id model_name

Here, model_name is a string that will be used to name a directory under the tmp/ directory.

Once training/testing finishes, 30 files will appear inside the tmp/model_name/ directory. The filenames are formatted as follows, where "src" and "tgt" are drawn from ['en', 'ar', 'zh'].
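The (src, tgt) naming scheme covers every ordered language pair. A quick illustrative sketch (not repository code) of the nine transfer directions the logs span:

```python
from itertools import product

langs = ['en', 'ar', 'zh']
# every ordered (src, tgt) transfer direction, e.g. ('en', 'ar')
pairs = list(product(langs, repeat=2))
print(len(pairs))  # 9 directions, including the 3 in-language ones
```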

A Python script reads the log files and reports the final results in the console as follows.

| Source \ Target |      English      |       Arabic      |      Chinese      |
|-----------------|-------------------|-------------------|-------------------|
| English         | 64.18/66.74/65.44 | 60.87/36.77/45.84 | 61.89/47.71/53.88 |
| Arabic          | 40.31/51.14/45.08 | 68.77/72.53/70.60 | 50.07/48.11/49.07 |
| Chinese         | 45.01/48.75/46.80 | 59.54/46.67/52.32 | 71.55/78.98/75.08 |
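Each cell appears to be a precision/recall/F1 triple (an inference from the numbers, not stated above): the third value matches 2PR/(P+R) computed from the first two, up to rounding. A quick check:

```python
def f1(precision, recall):
    # harmonic mean of precision and recall
    return 2 * precision * recall / (precision + recall)

# English -> English cell: 64.18/66.74/65.44
print(round(f1(64.18, 66.74), 2))  # 65.43 (table shows 65.44; P and R are themselves rounded)

# Arabic -> Arabic cell: 68.77/72.53/70.60
print(round(f1(68.77, 72.53), 2))  # 70.60
```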

Running experiments on CPU/GPU/Multi-GPU

  • If gpu_id is set to -1, the CPU will be used.
  • If gpu_id is set to a single number, only that GPU will be used.
  • If gpu_id is set to multiple comma-separated numbers (e.g., 0,1,2), multiple GPUs will be used in parallel.
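The rules above can be sketched as a small helper that maps the gpu_id string to a device list. This is purely illustrative (the name parse_gpu_ids is hypothetical; the repository's actual parsing may differ):

```python
def parse_gpu_ids(gpu_id: str):
    """Map the gpu_id argument to a list of device indices.

    Illustrative helper only; not the repository's actual parser.
    """
    ids = [int(x) for x in gpu_id.split(',')]
    if ids == [-1]:
        return []  # run on CPU
    return ids     # one entry -> single GPU; several -> parallel computing

print(parse_gpu_ids('-1'))     # []
print(parse_gpu_ids('0'))      # [0]
print(parse_gpu_ids('0,1,2'))  # [0, 1, 2]
```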


We borrowed and modified code from DrQA, OpenNMT, and Transformers. We express our gratitude to the authors of these repositories.


@inproceedings{ahmad2021gate,
    author = {Ahmad, Wasi Uddin and Peng, Nanyun and Chang, Kai-Wei},
    booktitle = {Proceedings of the AAAI Conference on Artificial Intelligence},
    title = {GATE: Graph Attention Transformer Encoder for Cross-lingual Relation and Event Extraction},
    year = {2021}
}


