Dataset: Freebase relations aligned with the New York Times corpus.
Basic Concepts
Word Embeddings
Positional Embeddings: The idea is that words closer to the target entities usually contain more useful information regarding the relation class. Each word's relative distance to the two entities is therefore encoded as an embedding to capture this signal.
Convolutional Neural Networks: capture n-gram-level features.
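A minimal sketch of the positional-embedding idea, assuming the common setup of clipping relative distances and looking them up in two small embedding tables (all sizes and names here are illustrative, not the paper's exact values):

```python
import numpy as np

MAX_DIST = 30   # relative distances clipped to [-MAX_DIST, MAX_DIST] (assumed)
POS_DIM = 5     # dimensionality of each positional embedding (assumed)

rng = np.random.default_rng(0)
# One embedding table per target entity, indexed by clipped relative distance.
pos_table_e1 = rng.normal(size=(2 * MAX_DIST + 1, POS_DIM))
pos_table_e2 = rng.normal(size=(2 * MAX_DIST + 1, POS_DIM))

def positional_features(sent_len, e1_idx, e2_idx):
    """Look up, for every token, its embeddings of the distances to e1 and e2."""
    idx = np.arange(sent_len)
    d1 = np.clip(idx - e1_idx, -MAX_DIST, MAX_DIST) + MAX_DIST  # shift to >= 0
    d2 = np.clip(idx - e2_idx, -MAX_DIST, MAX_DIST) + MAX_DIST
    return np.concatenate([pos_table_e1[d1], pos_table_e2[d2]], axis=1)

feats = positional_features(sent_len=8, e1_idx=1, e2_idx=5)
print(feats.shape)  # (8, 10)
```

In practice these per-token positional features are concatenated with the word embeddings before the convolutional layer.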
5.1 Piecewise Convolutional Neural Networks (Zeng et al., 2015)
This work builds a neural network that learns a relation extractor from distant supervision data. The architecture is similar to the models in 4.2 and 4.3 above, but its main contribution is piecewise max-pooling across the sentence. The model in 4.3 max-pools over the entire sentence, which discards a lot of information. Since there are two entities, a sentence can be divided into three segments; max-pooling over each segment separately preserves more of the useful information.
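The piecewise pooling step can be sketched as follows (a simplified illustration, assuming the feature map is split at the two entity positions; the function name is made up):

```python
import numpy as np

def piecewise_max_pool(conv_out, e1_idx, e2_idx):
    """Split the conv feature map (seq_len, n_filters) into three segments
    at the two entity positions, max-pool each, and concatenate."""
    segments = (conv_out[:e1_idx + 1],            # up to and including entity 1
                conv_out[e1_idx + 1:e2_idx + 1],  # between the two entities
                conv_out[e2_idx + 1:])            # after entity 2
    return np.concatenate([seg.max(axis=0) for seg in segments])

conv_out = np.arange(12.0).reshape(6, 2)  # seq_len=6, n_filters=2
pooled = piecewise_max_pool(conv_out, e1_idx=1, e2_idx=3)
print(pooled)  # [ 2.  3.  6.  7. 10. 11.]
```

The result is a `3 * n_filters` vector, versus `n_filters` for whole-sentence max-pooling, so the position of strong features relative to the entities is retained.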
5.2 Selective Attention over Instances (Lin et al., 2016)
To address the limitation in 5.1 of relying only on the most relevant sentence, this paper uses an attention mechanism over all the sentences in a bag. The final vector representation for the bag of sentences is found by taking an attention-weighted average of all the sentence vectors (r_ij, j = 1, 2, ..., q_i) in the bag.
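A minimal sketch of selective attention over a bag, where a query vector stands in for the learned relation representation (names and the scoring function are illustrative, not the paper's exact parameterization):

```python
import numpy as np

def selective_attention(sentence_vecs, relation_query):
    """Softmax-weighted average of the sentence vectors in a bag."""
    scores = sentence_vecs @ relation_query      # one relevance score per sentence
    alpha = np.exp(scores - scores.max())        # numerically stable softmax
    alpha /= alpha.sum()
    return alpha, alpha @ sentence_vecs          # (weights, bag vector)

vecs = np.array([[1.0, 0.0],   # three sentence vectors in one bag
                 [0.0, 1.0],
                 [1.0, 1.0]])
alpha, bag_vec = selective_attention(vecs, np.array([1.0, 0.0]))
```

Sentences that score higher against the relation query dominate the bag vector, while noisy sentences are down-weighted rather than discarded outright.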
It performs better than PCNN.
5.3 Multi-instance Multi-label CNNs (Jiang et al., 2016)
This addresses the information loss of 5.1 by using a cross-document max-pooling layer. Each sentence in the bag is first encoded as a vector. Then the final vector representation for the bag of sentences is found by taking a dimension-wise max of the sentence vectors (r_ij, j = 1, 2, ..., q_i): for each dimension, the largest value across all sentence vectors in the bag is kept.
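The dimension-wise max over a bag reduces to one line; a small sketch with made-up sentence vectors:

```python
import numpy as np

# Three (hypothetical) sentence vectors from the same bag, dim = 3.
sentence_vecs = np.array([[0.2, 0.9, 0.1],
                          [0.8, 0.1, 0.3],
                          [0.5, 0.4, 0.7]])

# Dimension-wise max: each output dimension takes its largest value
# across all sentences in the bag.
bag_vec = sentence_vecs.max(axis=0)
print(bag_vec)  # [0.8 0.9 0.7]
```

Unlike the attention-weighted average in 5.2, each dimension of the bag vector can come from a different sentence.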
CoRR (J), 2017: A Survey of Deep Learning Methods for Relation Extraction
Summary
Data
Basic Concepts
Supervised learning with CNNs
Early deep learning work on RE simply treated it as a multi-class classification problem.
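In this framing, a sentence representation is simply mapped to scores over a fixed set of relation classes and the argmax is predicted; a toy sketch (the relation names and random weights are made up for illustration):

```python
import numpy as np

RELATIONS = ["place_of_birth", "founder_of", "NA"]  # hypothetical label set

rng = np.random.default_rng(0)
W = rng.normal(size=(4, len(RELATIONS)))  # (feature_dim, n_classes), untrained

def predict_relation(sentence_vec):
    """Score the sentence vector against each class and pick the best."""
    scores = sentence_vec @ W
    return RELATIONS[int(np.argmax(scores))]

pred = predict_relation(np.ones(4))
```

Training replaces the random `W` with weights learned from labeled sentence/relation pairs via a softmax cross-entropy loss.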
Multi-instance learning models with distant supervision
Reframing the task as a multi-instance learning problem lets us build much larger training sets through distant supervision. In multi-instance learning, a label is attached to a bag of instances rather than to a single instance.
For RE, each entity pair defines a bag containing all the sentences that mention that entity pair. A relation label is then assigned to the whole bag, not to any single instance.
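The bag construction can be sketched as follows, with a toy knowledge base and sentences (all entities, relations, and sentences are made-up examples):

```python
from collections import defaultdict

# Hypothetical (e1, e2, sentence) triples from the corpus.
sentences = [
    ("Obama", "Hawaii", "Obama was born in Hawaii ."),
    ("Obama", "Hawaii", "Obama returned to Hawaii last week ."),
    ("Gates", "Microsoft", "Gates founded Microsoft in 1975 ."),
]
# Hypothetical KB mapping entity pairs to relations (distant supervision).
kb = {("Obama", "Hawaii"): "place_of_birth",
      ("Gates", "Microsoft"): "founder_of"}

# Group every sentence mentioning the same entity pair into one bag.
bags = defaultdict(list)
for e1, e2, sent in sentences:
    bags[(e1, e2)].append(sent)

# One relation label per bag, not per sentence.
labeled_bags = [(pair, kb[pair], sents) for pair, sents in bags.items()]
print(len(labeled_bags))  # 2
```

Note that not every sentence in a bag actually expresses the bag's relation; that label noise is exactly what the models in 5.1-5.3 try to handle.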
5.1 Piecewise Convolutional Neural Networks (Zeng et al., 2015)
5.2 Selective Attention over Instances (Lin et al., 2016)
It performs better than PCNN.
5.3 Multi-instance Multi-label CNNs (Jiang et al., 2016)
Results
Deep models generally outperform shallow ones, with attention + PCNN performing best. Curiously, the survey covers no LSTM-based work on RE.
Next paper
Relation Extraction : A Survey