CrossAttention

Relation Extraction Based on CrossAttention Neural Network

This repository contains the NYT and WebNLG datasets for the paper "Relation Extraction Based on CrossAttention Neural Network".

Overview

The original NYT dataset was proposed by Sebastian Riedel, Limin Yao, and Andrew McCallum in the paper "Modeling relations and their mentions without labeled text", available at https://link.springer.com/content/pdf/10.1007/978-3-642-15939-8_10.pdf.

The original WebNLG dataset was proposed by Claire Gardent, Anastasia Shimorina, Shashi Narayan, and Laura Perez-Beltrachini in the paper "Creating training corpora for NLG micro-planners", available at https://aclanthology.org/P17-1017.pdf. The original WebNLG data is also available at https://gitlab.com/shimorina/webnlg-dataset.

The modified NYT and WebNLG datasets were collected and uploaded by Zhepei Wei, Jianlin Su, Yue Wang, Yuan Tian, and Yi Chang, and are available at https://github.com/weizhepei/CasRel. Their paper "A Novel Cascade Binary Tagging Framework for Relational Triple Extraction" is available at https://aclanthology.org/2020.acl-main.136.pdf.
We thank the above authors for their contributions. We have uploaded both public datasets in JSON format. In addition, we split the test set of each dataset by overlap type, by number of triples, and by sentence length, and uploaded the resulting subsets.
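
To make the splits concrete, below is a minimal Python sketch of how a test set can be bucketed by overlap category and by triple count. It assumes a CasRel-style JSON schema (each example has a "text" string and a "triple_list" of [subject, relation, object] triples); the file path, the N>=5 cap, and the EPO-before-SEO precedence are illustrative assumptions, not necessarily the exact procedure used for the uploaded splits.

```python
import json
from collections import defaultdict

# Hypothetical path; the actual file names in this repository may differ.
TEST_FILE = "NYT/test_triples.json"

def overlap_type(triples):
    """Classify a sentence by the overlap categories common in the literature:
    EPO (Entity Pair Overlap): two triples share the same (subject, object) pair;
    SEO (Single Entity Overlap): two triples share an entity but not a full pair;
    Normal: no two triples share any entity.
    Giving EPO precedence when both apply is an assumption of this sketch."""
    pairs = [(s, o) for s, _, o in triples]
    if len(set(pairs)) < len(pairs):
        return "EPO"
    entities = [e for s, _, o in triples for e in (s, o)]
    if len(set(entities)) < len(entities):
        return "SEO"
    return "Normal"

def main():
    with open(TEST_FILE, encoding="utf-8") as f:
        data = json.load(f)

    by_overlap = defaultdict(list)
    by_count = defaultdict(list)
    for example in data:
        # Assumed schema per example:
        # {"text": "...", "triple_list": [[subject, relation, object], ...]}
        triples = [tuple(t) for t in example["triple_list"]]
        by_overlap[overlap_type(triples)].append(example)
        n = len(triples)
        by_count[f"N={n}" if n < 5 else "N>=5"].append(example)

    for name, subset in list(by_overlap.items()) + list(by_count.items()):
        print(f"{name}: {len(subset)} sentences")

if __name__ == "__main__":
    main()
```

A sentence-length split would follow the same pattern, bucketing on the token count of the "text" field instead of the triple count.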

Usage

We will upload the source code after code review.

References

  1. Yee Seng Chan and Dan Roth. 2011. Exploiting syntactico-semantic structures for relation extraction. In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies, Volume 1, pages 551–560. Association for Computational Linguistics.
  2. Dmitry Zelenko, Chinatsu Aone, and Anthony Richardella. 2003. Kernel methods for relation extraction. Journal of Machine Learning Research, 3(Feb):1083–1106.
  3. Qi Li and Heng Ji. 2014. Incremental joint extraction of entity mentions and relations. In Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 402–412.
  4. Xiang Ren, Zeqiu Wu, Wenqi He, Meng Qu, Clare R. Voss, Heng Ji, Tarek F. Abdelzaher, and Jiawei Han. 2017. CoType: Joint extraction of typed entities and relations with knowledge bases. In Proceedings of the 26th International Conference on World Wide Web, pages 1015–1024.
  5. Arzoo Katiyar and Claire Cardie. 2017. Going out on a limb: Joint extraction of entity mentions and relations without dependency trees. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 917–928.
  6. Xiaofeng Yu and Wai Lam. 2010. Jointly identifying entities and extracting relations in encyclopedia text via a graphical model approach. In Proceedings of the 23rd International Conference on Computational Linguistics: Posters, pages 1399–1407. Association for Computational Linguistics.
  7. Xiangrong Zeng, Daojian Zeng, Shizhu He, Kang Liu, and Jun Zhao. 2018. Extracting relational facts by an end-to-end neural model with copy mechanism. In Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 506–514.
  8. H. Ye, N. Zhang, S. Deng, M. Chen, C. Tan, F. Huang, and H. Chen. 2021. Contrastive Triple Extraction with Generative Transformer. In Proceedings of the Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI), pages 14257–14265.
  9. Tsu-Jui Fu, Peng-Hsuan Li, and Wei-Yun Ma. 2019. GraphRel: Modeling text as relational graphs for joint entity and relation extraction. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, pages 1409–1418.
  10. Zhepei Wei, Jianlin Su, Yue Wang, Yuan Tian, and Yi Chang. 2020. A novel cascade binary tagging framework for relational triple extraction. In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, pages 1476–1488.
  11. K. Sun, R. Zhang, S. Mensah, Y. Mao, and X. Liu. 2021. Progressive multitask learning with controlled information flow for joint entity and relation extraction. Proceedings of the AAAI Conference on Artificial Intelligence, 35:13851–13859.
  12. G. Brauwers and F. Frasincar. A General Survey on Attention Mechanisms in Deep Learning. IEEE Transactions on Knowledge and Data Engineering. doi: 10.1109/TKDE.2021.3126456.
  13. T. Lai, L. Cheng, D. Wang, et al. 2022. RMAN: Relational multi-head attention neural network for joint extraction of entities and relations. Applied Intelligence, 52:3132–3142. https://doi.org/10.1007/s10489-021-02600-2.
  14. Suncong Zheng, Feng Wang, Hongyun Bao, Yuexing Hao, Peng Zhou, and Bo Xu. 2017. Joint extraction of entities and relations based on a novel tagging scheme. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 1227–1236.
  15. M. E. Peters, M. Neumann, M. Iyyer, M. Gardner, C. Clark, K. Lee, and L. Zettlemoyer. 2018. Deep contextualized word representations. In Proceedings of the Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1, pages 2227–2237.
  16. Jacob Devlin, Ming-Wei Chang, Kenton Lee, and Kristina Toutanova. 2019. BERT: Pre-training of deep bidirectional transformers for language understanding. In Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pages 4171–4186.
  17. Weipeng Huang, Xingyi Cheng, Taifeng Wang, and Wei Chu. 2019. BERT-Based Multi-head Selection for Joint Entity-Relation Extraction. In Natural Language Processing and Chinese Computing (NLPCC 2019), Lecture Notes in Computer Science, vol. 11839. Springer, Cham. https://doi.org/10.1007/978-3-030-32236-6_65.
  18. David Wadden, Ulme Wennberg, Yi Luan, and Hannaneh Hajishirzi. 2019. Entity, relation, and event extraction with contextualized span representations. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 5788–5793. doi: 10.18653/v1/D19-1585.
  19. Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan N. Gomez, Łukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. In Advances in Neural Information Processing Systems, pages 5998–6008.
  20. Peixiang Zhong, Di Wang, and Chunyan Miao. 2019. Knowledge-enriched transformer for emotion detection in textual conversations. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 165–176.
  21. Jonas Gehring, Michael Auli, David Grangier, Denis Yarats, and Yann N. Dauphin. 2017. Convolutional sequence to sequence learning. In Proceedings of the 34th International Conference on Machine Learning, Volume 70, pages 1243–1252. PMLR.
  22. F. Yu and V. Koltun. 2016. Multi-Scale Context Aggregation by Dilated Convolutions. In Proceedings of the International Conference on Learning Representations (ICLR), pages 1–13.
  23. Yann N. Dauphin, Angela Fan, Michael Auli, and David Grangier. 2017. Language modeling with gated convolutional networks. In Proceedings of the 34th International Conference on Machine Learning, Volume 70, pages 933–941. JMLR.org.
  24. Minh-Thang Luong, Hieu Pham, and Christopher D. Manning. 2015. Effective approaches to attention-based neural machine translation. arXiv preprint arXiv:1508.04025.
  25. K. He, X. Zhang, S. Ren, and J. Sun. 2016. Deep Residual Learning for Image Recognition. In 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, pages 770–778. doi: 10.1109/CVPR.2016.90.
  26. Jimmy Lei Ba, Jamie Ryan Kiros, and Geoffrey E. Hinton. 2016. Layer normalization. arXiv preprint arXiv:1607.06450.
  27. Diederik P. Kingma and Jimmy Ba. 2014. Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
  28. Sebastian Riedel, Limin Yao, and Andrew McCallum. 2010. Modeling relations and their mentions without labeled text. In Joint European Conference on Machine Learning and Knowledge Discovery in Databases, pages 148–163.
  29. Claire Gardent, Anastasia Shimorina, Shashi Narayan, and Laura Perez-Beltrachini. 2017. Creating training corpora for NLG micro-planners. In Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 179–188.
  30. Xiangrong Zeng, Shizhu He, Daojian Zeng, Kang Liu, Shengping Liu, and Jun Zhao. 2019. Learning the extraction order of multiple relational facts in a sentence with reinforcement learning. In Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), pages 367–377.
