jiangqn/Empirical-Studies-for-Natural-Language-Processing
Empirical-Studies-for-Natural-Language-Processing

A resource list of empirical studies and investigations in natural language processing.

  • [ACL-18] How Much Attention Do You Need? A Granular Analysis of Neural Machine Translation Architectures. [paper]

  • [ACL-18] The Best of Both Worlds: Combining Recent Advances in Neural Machine Translation. [paper]

  • [EMNLP-18] Why Self-Attention? A Targeted Evaluation of Neural Machine Translation Architectures. [paper]

  • [EMNLP-18] An Empirical Study of Machine Translation for the Shared Task of WMT18. [paper]

  • [ACL-19] Do Neural Dialog Systems Use the Conversation History Effectively? An Empirical Study. [paper] [code]

  • [ACL-19] Revisiting Low-Resource Neural Machine Translation: A Case Study. [paper]

This paper shows that the performance of NMT models can be improved by a large margin through careful hyperparameter tuning. Reducing the BPE vocabulary size and applying word dropout are the most useful tricks for improving NMT performance when little parallel data is available.
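As a rough illustration of the word dropout trick mentioned above, the sketch below randomly replaces input tokens with an unknown-word symbol during training. The function name, the `<unk>` symbol, and the dropout rate are illustrative assumptions, not details taken from the paper.

```python
import random

def word_dropout(tokens, unk_token="<unk>", p=0.1, rng=None):
    """Randomly replace each token with unk_token with probability p.

    Applied only at training time; acts as a regularizer that prevents
    the model from over-relying on any single input word.
    """
    rng = rng or random.Random()
    return [unk_token if rng.random() < p else tok for tok in tokens]

# With p=0.0 nothing is dropped; with p=1.0 every token is replaced.
sentence = ["the", "cat", "sat"]
print(word_dropout(sentence, p=0.0))  # ['the', 'cat', 'sat']
print(word_dropout(sentence, p=1.0))  # ['<unk>', '<unk>', '<unk>']
```

In practice the replacement is usually done on token IDs inside the data pipeline rather than on strings, but the logic is the same.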

  • [EMNLP-19] An Empirical Comparison on Imitation Learning and Reinforcement Learning for Paraphrase Generation. [paper]

  • [EMNLP-19] An Empirical Study of Incorporating Pseudo Data into Grammatical Error Correction. [paper]

  • [EMNLP-19] Investigating the Effectiveness of BPE: The Power of Shorter Sequences. [paper]

  • [EMNLP-19] Are We Modeling the Task or the Annotator? An Investigation of Annotator Bias in Natural Language Understanding Datasets. [paper]

  • [arXiv-19] An Empirical Study of Generation Order for Machine Translation. [paper]

  • [arXiv-19] Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer. [paper]

  • [arXiv-19] An Empirical Investigation of Pre-Trained Transformer Language Models for Open-Domain Dialogue Generation. [paper]
