Criteria for including a paper:
1. A strong method - a pioneering or breakthrough approach
2. A strong application - offers a completely new perspective on a problem
(Applications: product title compression, news headline rewriting, push notification rewriting...)
Sentiment reasoning, bias / viewpoint-difference detection
VADER Sentiment Analysis. VADER (Valence Aware Dictionary and sEntiment Reasoner) is a lexicon and rule-based sentiment analysis tool that is specifically attuned to sentiments expressed in social media, and works well on texts from other domains.
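A minimal usage sketch, assuming the vaderSentiment Python package (pip install vaderSentiment); the example sentence is illustrative only:

```python
# Minimal VADER usage sketch (assumes: pip install vaderSentiment).
from vaderSentiment.vaderSentiment import SentimentIntensityAnalyzer

analyzer = SentimentIntensityAnalyzer()

# polarity_scores returns neg/neu/pos proportions plus a normalized
# 'compound' score in [-1, 1]; a common heuristic reads compound >= 0.05
# as positive and <= -0.05 as negative.
scores = analyzer.polarity_scores("The new phone is AMAZING, but the battery life is terrible :(")
print(scores)
```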
Bias Statement Detector (BSD) computationally detects and quantifies the degree of bias in sentence-level text of news stories.
WordNet
Word Embedding
Machine Translation
Fluency and Coherency
Deep Reinforcement Learning for NLP ACL 2018 ppt
2017 - ...
2018 - neural networks, attention mechanisms, representation learning, semantics and knowledge...
-
Representation learning
-
Word representation learning
Word representations that fuse multiple information sources, are task- or domain-specific, cross-lingual, and sense-disambiguated
-
Sentence representation learning
-
-
...
-
The most important NLP highlights of 2018 (PDF Report)
Tracks the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the most common NLP tasks.
https://github.com/sebastianruder/NLP-progress
https://github.com/Yale-LILY/Awesome-NLP-Research
https://github.com/Kyubyong/nlp_tasks
https://github.com/omarsar/nlp_overview https://nlpoverview.com/
https://aclweb.org/aclwiki/State_of_the_art
TutorialBank: Learning NLP Made Easier search engine
LectureBank: a corpus for NLP Education and Prerequisite Chain Learning
-
Chinese
https://github.com/didi/ChineseNLP https://chinesenlp.xyz
https://github.com/chineseGLUE/chineseGLUE
Natural Language Processing Research Report - AMiner research report, Tsinghua University & Chinese Academy of Engineering Knowledge Intelligence Joint Laboratory
-
CCKS - China Conference on Knowledge Graph and Semantic Computing
-
CCF - China Computer Federation
CNCC - China National Computer Congress
CCFTF (TechFrontier)
CoNLL
The CoNLL shared tasks are driven mainly by academia, so they lean toward fundamental research problems in natural language processing.
http://universaldependencies.org/conll17/
http://universaldependencies.org/conll18/
CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies
result | paper
CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies (systems are evaluated on CoNLL-U output; see the reading sketch after this block)
result | paper
Uppsala - Universal Word Segmentation: Implementation and Interpretation code
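Both shared tasks above take raw text as input and score systems on CoNLL-U output. A minimal, dependency-free sketch of reading the CoNLL-U format (10 tab-separated columns per token; the file name is a placeholder):

```python
# Minimal CoNLL-U reader sketch: yields each sentence as a list of
# (id, form, upos, head, deprel) tuples; skips comments, multiword
# token ranges (e.g. "1-2") and empty nodes (e.g. "3.1").
def read_conllu(path):
    sentence = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            line = line.rstrip("\n")
            if not line:                          # blank line ends a sentence
                if sentence:
                    yield sentence
                sentence = []
                continue
            if line.startswith("#"):              # sentence-level comments, e.g. "# text = ..."
                continue
            cols = line.split("\t")               # ID FORM LEMMA UPOS XPOS FEATS HEAD DEPREL DEPS MISC
            if not cols[0].isdigit():             # skip range / empty-node lines
                continue
            head = int(cols[6]) if cols[6].isdigit() else -1
            sentence.append((int(cols[0]), cols[1], cols[3], head, cols[7]))
    if sentence:
        yield sentence

# Example (file name is a placeholder):
# for sent in read_conllu("en_ewt-ud-dev.conllu"):
#     print(sent[:3])
```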
-
Recent Trends in Deep Learning Based Natural Language Processing, Tom Young, Devamanyu Hazarika, Soujanya Poria, Erik Cambria, last revised 25 Nov 2018. arxiv
-
Analysis Methods in Neural Language Processing: A Survey, Yonatan Belinkov, James Glass, 2019 TACL. arxiv | code
- Lexical and Neural Networks Combined
- Adversarial Learning
https://github.com/papers-we-love/papers-we-love
Institutions:
VLG papers_we_read
Individuals:
zhpmatrix PaperReading
COLING - Europe - focuses on linguistic regularities and model analysis (interpretability research)
ACL - North America
NAACL - North America
EMNLP -
2018 - Highlights
-
Why are you telling me this? Relevance & informativity in language processing. slides
Practical Parsing for Downstream Applications. tutorial
-
ACL 2018 Highlights
-
NAACL-HLT 2018 Highlights
-
EMNLP 2018 Highlights
- Shallow Semantic Parsing of Chinese, Honglin Sun, Daniel Jurafsky, HLT-NAACL 2004 paper
- Chinese Word Segmentation: Another Decade Review (2007-2017), arxiv
- Analogical Reasoning on Chinese Morphological and Semantic Relations, ACL 2018. paper | code
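A minimal sketch of the word-analogy query that the paper above evaluates, assuming gensim and one of the pretrained Chinese word-vector files released with the paper (the file name below is a placeholder):

```python
# Analogy query sketch (assumes: pip install gensim, plus a pretrained
# Chinese word-vector file in word2vec text format; the path is a placeholder).
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format("chinese_word_vectors.txt", binary=False)

# "国王" - "男人" + "女人" ≈ ?  (king - man + woman)
for word, score in vectors.most_similar(positive=["国王", "女人"], negative=["男人"], topn=5):
    print(word, round(score, 3))
```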
-
Improving language understanding with unsupervised learning, OpenAI 2018
- language model
-
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova, Google 2018
- language model
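A minimal masked-language-model sketch for the pre-trained model above, assuming the HuggingFace transformers and torch packages and the bert-base-uncased checkpoint (none of which appear in the original entry):

```python
# Masked-LM sketch with a pre-trained BERT checkpoint
# (assumes: pip install torch transformers).
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
model.eval()

inputs = tokenizer("The capital of France is [MASK].", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits                    # [1, seq_len, vocab_size]

# Find the [MASK] position and print the top-5 predicted fillers.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```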
-
Generating and Exploiting Large-scale Pseudo Training Data for Zero Pronoun Resolution, Ting Liu, Yiming Cui, et al., ACL 2017 arxiv | slides | Yiming Cui
- data scarcity
- transfer learning
- Unsupervised Named-Entity Extraction from the Web: An Experimental Study, 2005 paper
- Unsupervised Models for Named Entity Classification, Michael Collins and Yoram Singer, 1999 paper ⭐⭐⭐⭐
- Global Inference for Sentence Compression: An Integer Linear Programming Approach, James Clarke, Mirella Lapata, Journal of Artificial Intelligence Research, 2008 code ⭐⭐⭐ (see the ILP sketch at the end of this list)
- Language as a Latent Variable: Discrete Generative Models for Sentence Compression, Yishu Miao, Phil Blunsom, EMNLP 2016 arxiv
- Sentence Compression as Tree Transduction, Trevor Anthony Cohn, Mirella Lapata, 2009 arxiv
- Sentence Reduction for Automatic Text Summarization, Hongyan Jing 2000 paper
- A Multi-task Learning Approach for Improving Product Title Compression with User Search Log Data, Jingang Wang, Junfeng Tian, Long Qiu, Sheng Li, Jun Lang, Luo Si, Man Lan, AAAI 2018 arxiv
- A Neural Attention Model for Abstractive Sentence Summarization, Alexander M. Rush, Sumit Chopra, Jason Weston, Facebook, EMNLP 2015 code
- Enhanced LSTM for Natural Language Inference, Qian Chen, Xiaodan Zhu, Zhenhua Ling, Si Wei, Hui Jiang, Diana Inkpen, ACL 2017 arxiv | code
- A Decomposable Attention Model for Natural Language Inference, Ankur P. Parikh, Oscar Täckström, Dipanjan Das, Jakob Uszkoreit, EMNLP 2016 arxiv
- Learning Natural Language Inference with LSTM, Shuohang Wang, Jing Jiang, 2016, arxiv | code
- Convolutional neural network architectures for matching natural language sentences, B Hu, Z Lu, H Li, Q Chen - Advances in neural information processing systems, 2014 paper
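The Clarke & Lapata entry above casts sentence compression as an integer linear program over binary keep/delete decisions for each word. A toy sketch of that idea using PuLP; the word scores, length bound, and single dependency constraint are made up for illustration and are much simpler than the paper's objective and grammaticality constraints:

```python
# Toy ILP word-deletion sketch in the spirit of Clarke & Lapata (2008)
# (assumes: pip install pulp; scores and constraints are illustrative only).
from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary, PULP_CBC_CMD

words  = ["the", "overly", "long", "product", "title", "with", "extra", "words"]
scores = [0.2,   0.1,      0.4,    0.9,       0.9,     0.1,    0.1,     0.3]   # made-up importance
max_len = 4                                    # keep at most 4 words

prob = LpProblem("sentence_compression", LpMaximize)
keep = [LpVariable(f"keep_{i}", cat=LpBinary) for i in range(len(words))]

prob += lpSum(s * x for s, x in zip(scores, keep))   # maximize importance of kept words
prob += lpSum(keep) <= max_len                       # length constraint
prob += keep[3] <= keep[4]                           # toy dependency: keep "product" only if "title" is kept

prob.solve(PULP_CBC_CMD(msg=0))
print(" ".join(w for w, x in zip(words, keep) if x.value() == 1))
```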