NAACL-2018-Simultaneously Self-attending to All Mentions for Full-Abstract Biological Relation Extraction #33
Labels:
- C: Code Implementation
- MTL(M): Multi-Task/Joint Learning Model
- RE(T): Relation Extraction Task
- Transformer(M): Transformer-Based Model
One-sentence summary:
A model built from CNNs and a Transformer variant that learns relations between entities in long text. It uses a Transformer encoder and jointly solves the NER and RE tasks with a cross-entropy loss.
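To make the joint objective concrete, here is a minimal numpy sketch of summing per-token NER cross-entropy with a relation cross-entropy. The `lam` weighting factor is an assumption for illustration; the paper simply optimizes both losses jointly.

```python
import numpy as np

def cross_entropy(logits, label):
    """Softmax cross-entropy for a single example (numerically stable)."""
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def joint_loss(ner_logits, ner_labels, rel_logits, rel_label, lam=1.0):
    """Joint objective: per-token NER losses plus the relation loss.
    `lam` is a hypothetical weighting knob, not from the paper."""
    ner = sum(cross_entropy(l, y) for l, y in zip(ner_logits, ner_labels))
    return ner + lam * cross_entropy(rel_logits, rel_label)
```

In practice both heads read from the same shared encoder, so gradients from each task shape the token representations.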
Resources:
Keywords:
Notes:
Problem setting: standard RE operates on a short sentence containing a single entity pair and predicts whether a relation holds. This ignores dependencies among different mentions of the same entity, and also relations whose entities appear in different sentences (closely related to coreference).
For example, in the BioCreative V CDR dataset, over 30% of the relations cross sentence boundaries.
Even though the two entities never co-occur within a single sentence, we can still infer that azathioprine can cause the side effect fibrosis.
To facilitate efficient full-abstract relation extraction from biological text, we propose Bi-affine Relation Attention Networks (BRANs), a combination of network architecture, multi-instance and multi-task learning designed to extract relations between entities in biological text without requiring explicit mention-level annotation.
BRAN: a neural network combining multi-instance and multi-task learning.
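The "bi-affine relation attention" plus multi-instance aggregation can be sketched as follows: each token gets head and tail role projections, a bilinear form scores every token pair for every relation type, and the score for an entity pair pools over all of its mention pairs with log-sum-exp. This is a simplified numpy illustration; shapes and helper names are mine, not the paper's.

```python
import numpy as np

def pair_scores(tokens, w_head, w_tail, bilinear):
    """Biaffine scoring: project tokens into head/tail roles, then a
    bilinear tensor yields a (seq, seq, n_rel) score over all token pairs."""
    heads = np.maximum(tokens @ w_head, 0)   # (seq, d)
    tails = np.maximum(tokens @ w_tail, 0)   # (seq, d)
    # bilinear has shape (d, n_rel, d)
    return np.einsum("id,dre,je->ijr", heads, bilinear, tails)

def entity_pair_score(scores, mentions1, mentions2, rel):
    """Multi-instance pooling: log-sum-exp over all mention-pair scores,
    so no single mention pair needs an explicit relation annotation."""
    vals = np.array([scores[i, j, rel] for i in mentions1 for j in mentions2])
    m = vals.max()
    return m + np.log(np.exp(vals - m).sum())
```

Log-sum-exp acts as a smooth maximum: the entity-pair score is dominated by the most confident mention pair, but every pair still receives gradient.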
We synthesize convolutions and self-attention, a modification of the Transformer encoder introduced by Vaswani et al. (2017), over sub-word tokens to efficiently incorporate into token representations rich context between distant mention pairs across the entire abstract.
Combines a CNN with a Transformer variant, learning embeddings at the sub-word level.
Model figure:
Results:
Follow-up work: #99