
NAACL-2018-Simultaneously Self-attending to All Mentions for Full-Abstract Biological Relation Extraction #33

BrambleXu opened this issue Mar 15, 2019 · 0 comments
Labels: C Code Implementation · MTL(M) Multi-Task/Joint Learning Model · RE(T) Relation Extraction Task · Transformer(M) Transformer Based Model


BrambleXu commented Mar 15, 2019

One-sentence summary:

A model built from CNNs and a Transformer that learns relations between entities in long text.

Uses a Transformer; jointly solves the NER and RE tasks with a cross-entropy loss.

Resources:

  • pdf
  • code: Python 2.7, TensorFlow 1.0.1

Keywords:

  • datasets: BioCreative V Chemical Disease Relation (CDR), Chemical Protein Relations (CPR), Comparative Toxicogenomics Database (CTD)
  • bi-affine relation attention network

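The bi-affine relation attention listed above scores every (head token, tail token) pair under every relation type. A minimal NumPy sketch of the scoring step, not the paper's implementation (dimensions and initialization are illustrative assumptions):

```python
import numpy as np

def biaffine_scores(head, tail, W):
    """Bi-affine scoring: for every (head i, tail j) token pair and every
    relation r, compute head_i^T W_r tail_j.
    head: (n, d), tail: (n, d), W: (d, r, d) -> scores: (n, r, n)."""
    return np.einsum('id,drk,jk->irj', head, tail_W := W, tail) if False else \
        np.einsum('id,drk,jk->irj', head, W, tail)

rng = np.random.default_rng(0)
n, d, r = 6, 8, 3                      # 6 tokens, hidden size 8, 3 relations
head = rng.normal(size=(n, d))         # head-role token projections
tail = rng.normal(size=(n, d))         # tail-role token projections
W = rng.normal(size=(d, r, d))         # one bilinear map per relation
scores = biaffine_scores(head, tail, W)
print(scores.shape)  # (6, 3, 6)
```

Each `scores[i, r, j]` is a single bilinear form `head[i] @ W[:, r, :] @ tail[j]`; the `einsum` just batches all pairs at once.
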
Notes:

Background: conventional RE works on a short sentence containing a single entity pair and predicts whether a relation holds. This ignores the interactions among different mentions of the same entity, and the relations between entities that appear in different sentences (somewhat like a coreference problem).

For example, in the BioCreative V CDR dataset, more than 30% of relations cross sentence boundaries.

[Figure: example abstract in which azathioprine and fibrosis are mentioned in different sentences]

Even though the two never appear in the same sentence, we can still tell that azathioprine can cause the side effect fibrosis.

To facilitate efficient full-abstract relation extraction from biological text, we propose Bi-affine Relation Attention Networks (BRANs), a combination of network architecture, multi-instance and multi-task learning designed to extract relations between entities in biological text without requiring explicit mention-level annotation.

BRAN: a neural-network architecture combined with multi-instance and multi-task learning.
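Multi-instance learning here means the model never sees mention-level labels: scores for all mention pairs of one (chemical, disease) entity pair are pooled into a single entity-level score. A common pooling choice for this setting is LogSumExp, a smooth maximum; a minimal sketch under that assumption:

```python
import numpy as np

def entity_pair_score(pair_scores):
    """Pool the scores of all mention pairs belonging to one entity pair
    into one relation score. LogSumExp acts as a smooth maximum, so a
    single strongly-scoring mention pair can drive the entity-level
    prediction, while gradients still flow to every mention pair."""
    m = pair_scores.max()
    return m + np.log(np.sum(np.exp(pair_scores - m)))  # numerically stable

mention_pair_scores = np.array([-2.0, 0.5, 3.1])  # e.g. 3 mention pairs
score = entity_pair_score(mention_pair_scores)
assert score >= mention_pair_scores.max()  # never below the best mention pair
```

Subtracting the max before exponentiating avoids overflow; the result is always between the maximum score and the maximum plus log of the number of mention pairs.
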

We synthesize convolutions and self-attention, a modification of the Transformer encoder introduced by Vaswani et al. (2017), over sub-word tokens to efficiently incorporate into token representations rich context between distant mention pairs across the entire abstract.

Combine CNNs with a variant of the Transformer, and learn embeddings at the sub-word level.
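One plausible way to "synthesize convolutions and self-attention" is to swap the Transformer's position-wise feed-forward sublayer for narrow convolutions, so each token also mixes in its local neighbours. A hedged NumPy sketch of that idea; the kernel width of 3 and all dimensions are illustrative assumptions, not taken from the paper's code:

```python
import numpy as np

def conv1d_same(x, W, b):
    """Width-k 1D convolution over the token axis with 'same' padding.
    x: (n, d_in), W: (k, d_in, d_out), b: (d_out,) -> (n, d_out)."""
    k = W.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))           # pad token axis only
    out = np.zeros((x.shape[0], W.shape[2]))
    for i in range(x.shape[0]):
        # window of k tokens, contracted against the kernel
        out[i] = np.einsum('kd,kde->e', xp[i:i + k], W) + b
    return out

def conv_feedforward(x, W1, b1, W2, b2):
    """Feed-forward sublayer with the position-wise linear layers replaced
    by convolutions, so the sublayer sees a small local context window."""
    h = np.maximum(conv1d_same(x, W1, b1), 0.0)    # ReLU nonlinearity
    return conv1d_same(h, W2, b2)

rng = np.random.default_rng(1)
x = rng.normal(size=(10, 16))                      # 10 tokens, hidden size 16
W1 = rng.normal(size=(3, 16, 32)) * 0.1            # expand to 32 dims
b1 = np.zeros(32)
W2 = rng.normal(size=(3, 32, 16)) * 0.1            # project back to 16 dims
b2 = np.zeros(16)
y = conv_feedforward(x, W1, b1, W2, b2)
print(y.shape)  # (10, 16)
```

With width-1 kernels this reduces exactly to the standard per-token feed-forward layer; widening the kernel is what lets the sublayer capture short-range context in addition to what self-attention provides.
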

Model diagram:

Results:

Follow-up work: #99
