This is the official repository for "Open Relation Extraction via Query-based Span Prediction" by Huifan Yang, Da-Wei Li, Zekun Li, Donglin Yang, Jinsheng Qi, and Bin Wu. The video talk and slides are available. Please cite & star this work if it is useful to you.
- We propose a novel query-based open relation extractor, QORE, which utilizes a Transformers-based language model to derive a representation of the interaction between the arguments and the context (see the illustrative sketch after this list).
- We carry out extensive experiments on seven datasets covering four languages, showing that QORE models significantly outperform conventional rule-based systems and the state-of-the-art method LOREM.
- Considering the practical challenges of ORE, we investigate the zero-shot domain transferability and the few-shot learning ability of QORE. The experimental results show that our models maintain high precision when transferring to new domains or training on less data.
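
To give a rough sense of the query-based span-prediction idea, the sketch below encodes an argument-aware query together with the context using a Transformers-based language model and predicts the start and end of a relation span. This is a minimal, hypothetical example, not the exact QORE architecture: the encoder name, the query template, and the linear span head are assumptions made purely for illustration.

```python
# Minimal sketch of query-based span prediction for open relation extraction.
# NOTE: not the authors' implementation; the encoder, query template, and
# span head below are illustrative assumptions only.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

class QuerySpanExtractor(nn.Module):
    def __init__(self, encoder_name="bert-base-multilingual-cased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Two logits per token: start and end of the relation span.
        self.span_head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                              # (batch, seq_len, hidden)
        start_logits, end_logits = self.span_head(hidden_states).split(1, dim=-1)
        return start_logits.squeeze(-1), end_logits.squeeze(-1)

tokenizer = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
model = QuerySpanExtractor()

# Encode the argument-aware query jointly with the context so the encoder can
# model the interaction between the arguments and the context.
query = "What is the relation between Tim Cook and Apple ?"
context = "Tim Cook is the chief executive officer of Apple."
inputs = tokenizer(query, context, return_tensors="pt")

start_logits, end_logits = model(inputs["input_ids"], inputs["attention_mask"])
start = start_logits.argmax(dim=-1).item()
end = end_logits.argmax(dim=-1).item()
span_tokens = inputs["input_ids"][0][start:end + 1]
print(tokenizer.decode(span_tokens))  # relation span predicted by the (untrained) head
```

In this formulation, training would supervise the start/end logits with the gold relation span, so a single encoder jointly represents the query (arguments) and the context.
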
The implementation of the QORE model and the experimental datasets used have been integrated into the QuORE repository.