Deep Learning and Natural Language Understanding Reading Group

Presenters: Hoa Le, Claire Gardent, Anastasia Shimorina, Denis Paperno. Organizer: Synalp team, Loria laboratory

| Date | Presenter | Paper |
|------|-----------|-------|
| 01/03/2018 | Hoa | Adams Wei Yu, Hongrae Lee, Quoc V. Le. Learning to Skim Text. ACL 2017 |
| 08/03/2018 | Claire | Abigail See, Peter J. Liu, Christopher D. Manning. Get To The Point: Summarization with Pointer-Generator Networks. ACL 2017 |
| 15/03/2018 | Hoa | Limitations of Neural Machine Translation (NMT):<br>Philipp Koehn and Rebecca Knowles. Six Challenges for Neural Machine Translation. First Workshop on Neural Machine Translation 2017<br>Christopher Manning, Kyunghyun Cho, Thang Luong. Neural Machine Translation - Tutorial. ACL 2016<br>Advancing NMT:<br>- On the vocabulary aspect:<br>by softmax scaling: Sébastien Jean, Kyunghyun Cho, Roland Memisevic, Yoshua Bengio. On Using Very Large Target Vocabulary for Neural Machine Translation. ACL 2015<br>by copy mechanism: Thang Luong, Ilya Sutskever, Quoc Le, Oriol Vinyals, Wojciech Zaremba. Addressing the Rare Word Problem in Neural Machine Translation. ACL 2015<br>by byte-pair encoding: Rico Sennrich, Barry Haddow, Alexandra Birch. Neural Machine Translation of Rare Words with Subword Units. ACL 2016 (a minimal BPE sketch follows the table)<br>- On the memory aspect:<br>by global and local attention: Thang Luong, Hieu Pham, and Chris Manning. Effective Approaches to Attention-based Neural Machine Translation. EMNLP 2015<br>by coverage mechanism: Zhaopeng Tu, Zhengdong Lu, Yang Liu, Xiaohua Liu and Hang Li. Modeling Coverage for Neural Machine Translation. ACL 2016<br>- On the language complexity aspect:<br>by sub-word modeling: Thang Luong and Chris Manning. Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models. ACL 2016<br>- On the data aspect:<br>by using monolingual data: Rico Sennrich, Barry Haddow, and Alexandra Birch. Improving Neural Machine Translation Models with Monolingual Data. ACL 2016<br>by multilingual and multi-task learning: Thang Luong, Quoc Le, Ilya Sutskever, Oriol Vinyals, Lukasz Kaiser. Multi-task Sequence to Sequence Learning. ICLR 2016 |
| 22-29/03/2018 | Hoa | Hassan et al., Achieving Human Parity on Automatic Chinese to English News Translation. Microsoft Research preprint 2018. [Summary slides], [Dual Learning summary (external link)]<br>3 major components/techniques:<br>- Dual learning:<br>Dual unsupervised learning: He et al., Dual Learning for Machine Translation. NIPS 2016<br>Dual supervised learning: Xia et al., Dual Supervised Learning. ICML 2017<br>- Joint training of S2T and T2S:<br>Gulcehre et al., On Using Monolingual Corpora in Neural Machine Translation. arXiv 2015<br>Back-translation: Sennrich et al., Improving Neural Machine Translation Models with Monolingual Data. ACL 2016<br>Joint training: Zhang et al., Joint Training for Neural Machine Translation Models with Monolingual Data. AAAI 2018<br>- Deliberation networks: Xia et al., Deliberation Networks: Sequence Generation Beyond One-Pass Decoding. NIPS 2017 |
| 05/04/2018 | Hoa | Learning language representation with Autoencoders (AEs): [Slides]<br>(CNN-DCNN) Autoencoder (AE): Yizhe Zhang, Dinghan Shen, Guoyin Wang, Zhe Gan, Ricardo Henao, Lawrence Carin. Deconvolutional Paragraph Representation Learning. NIPS 2017<br>(Sequential) Denoising Autoencoder (DAE): Felix Hill, Kyunghyun Cho, Anna Korhonen. Learning Distributed Representations of Sentences from Unlabelled Data. NAACL-HLT 2016<br>Variational Autoencoder (VAE): Samuel R. Bowman, Luke Vilnis, Oriol Vinyals, Andrew M. Dai, Rafal Jozefowicz, Samy Bengio. Generating Sentences from a Continuous Space. CoNLL 2016 |
| 12/04/2018 | Anastasia | Data-to-text generation: Albert Gatt and Emiel Krahmer. Survey of the State of the Art in Natural Language Generation: Core Tasks, Applications and Evaluation. Journal of Artificial Intelligence Research 2018. [Summary slides] |
| 19/04/2018 | Hoa | Generative Adversarial Networks (GAN): [External Slides], [Supplementaries]<br>Goodfellow et al., Generative Adversarial Networks. NIPS 2014<br>Ian Goodfellow. NIPS 2016 Tutorial: Generative Adversarial Networks. arXiv 2017 |
| 26/04/2018 | Hoa | Adversarial Autoencoder (AAE): Alireza Makhzani, Jonathon Shlens, Navdeep Jaitly, Ian Goodfellow. Adversarial Autoencoders. ICLR 2016. [External Slides], [Supplementaries] |
| 03/05/2018 | Hoa | Variants of seq2seq models in PyTorch & TensorFlow (practical) |
| 10/05/2018 | Hoa | Professor Forcing (GAN) and Scheduled Sampling (Curriculum Learning) [External Slides], [Supplementaries]:<br>Alex Lamb, Anirudh Goyal, Ying Zhang, Saizheng Zhang, Aaron Courville, Yoshua Bengio. Professor Forcing: A New Algorithm for Training Recurrent Networks. NIPS 2016<br>Samy Bengio, Oriol Vinyals, Navdeep Jaitly, Noam Shazeer. Scheduled Sampling for Sequence Prediction with Recurrent Neural Networks. NIPS 2015<br>Ferenc Huszar. How (Not) to Train Your Generative Model: Scheduled Sampling, Likelihood, Adversary? arXiv 2015<br>(a scheduled sampling decoder sketch follows the table) |
| 17/05/2018 | Denis | Purely unsupervised machine translation (1) [Slides]<br>Guillaume Lample, Ludovic Denoyer, Marc'Aurelio Ranzato. Unsupervised Machine Translation Using Monolingual Corpora Only. ICLR 2018<br>Mikel Artetxe, Gorka Labaka, Eneko Agirre, Kyunghyun Cho. Unsupervised Neural Machine Translation. ICLR 2018 |
| | Hoa | Static & dynamic RNN in TensorFlow (practical) (a static vs. dynamic RNN sketch follows the table) |
| 24/05/2018 | Claire | "Neural Approaches to Text Production" tutorial (practical) |
| 07/06/2018 | Hoa | Purely unsupervised machine translation (2)<br>Lample et al., Phrase-Based & Neural Unsupervised Machine Translation. arXiv 2018<br>Conneau et al., Word Translation Without Parallel Data. ICLR 2018 |
| 14/06/2018 | Hoa | Ganin et al., Domain-Adversarial Training of Neural Networks. JMLR 2016 |
| 17/12/2018 | Hoa | RL for sequence prediction literature [Slides] |
| 27/03/2019 | Hoa | Graph Neural Networks literature [Slides] |
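
For the byte-pair-encoding entry above (Sennrich et al., ACL 2016), here is a minimal sketch of the BPE merge loop on a toy vocabulary. It mirrors the algorithm described in the paper, but the variable names and the toy corpus are illustrative, not the authors' code:

```python
# Minimal byte-pair-encoding sketch (illustration only): repeatedly merge
# the most frequent adjacent symbol pair inside words.
import re
from collections import Counter

def get_pair_stats(vocab):
    """Count adjacent symbol pairs over a {word: frequency} vocab,
    where each word is a space-separated sequence of symbols."""
    pairs = Counter()
    for word, freq in vocab.items():
        symbols = word.split()
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(pair, vocab):
    """Replace every occurrence of the pair with its concatenation."""
    pattern = re.compile(r'(?<!\S)' + re.escape(' '.join(pair)) + r'(?!\S)')
    return {pattern.sub(''.join(pair), word): freq for word, freq in vocab.items()}

# Toy corpus: words split into characters, with an end-of-word marker.
vocab = {'l o w </w>': 5, 'l o w e r </w>': 2,
         'n e w e s t </w>': 6, 'w i d e s t </w>': 3}

for i in range(10):                    # number of merges = subword vocab budget
    stats = get_pair_stats(vocab)
    if not stats:
        break
    best = max(stats, key=stats.get)   # most frequent adjacent pair
    vocab = merge_pair(best, vocab)
    print(f'merge {i}: {best}')
```

Each merge turns the most frequent adjacent pair into a single subword unit, so frequent character strings end up as whole symbols while rare words remain decomposable into smaller pieces.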
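
In the scheduled sampling paper above (Bengio et al., NIPS 2015), the decoder is sometimes fed its own previous prediction instead of the gold token during training. Below is a minimal PyTorch sketch of one such training step; the module sizes, sampling probability, and helper names are made up for illustration:

```python
# Minimal scheduled-sampling sketch in PyTorch (illustration only;
# sizes and the sampling probability are made up).
import random
import torch
import torch.nn as nn

vocab_size, emb_dim, hid_dim = 1000, 64, 128

embed = nn.Embedding(vocab_size, emb_dim)
decoder = nn.GRUCell(emb_dim, hid_dim)
project = nn.Linear(hid_dim, vocab_size)
criterion = nn.CrossEntropyLoss()

def decode_step_loss(target, hidden, sampling_prob=0.25):
    """One training pass over a target sequence of shape (batch, seq_len).
    With probability `sampling_prob`, the next input is the model's own
    greedy prediction instead of the gold token (scheduled sampling)."""
    batch, seq_len = target.shape
    inp = target[:, 0]                       # assume the first token is <sos>
    loss = 0.0
    for t in range(1, seq_len):
        hidden = decoder(embed(inp), hidden)
        logits = project(hidden)
        loss = loss + criterion(logits, target[:, t])
        if random.random() < sampling_prob:  # sample from the model
            inp = logits.argmax(dim=-1).detach()
        else:                                # teacher forcing
            inp = target[:, t]
    return loss / (seq_len - 1)

# Toy usage: random targets and a zero initial hidden state.
tgt = torch.randint(0, vocab_size, (4, 10))
h0 = torch.zeros(4, hid_dim)
loss = decode_step_loss(tgt, h0)
loss.backward()
```

In the paper the sampling probability is annealed over training (the "schedule"); the fixed value here only keeps the sketch short.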
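
For the static vs. dynamic RNN practical, a small sketch with the TF 1.x-era API (accessed through tf.compat.v1 here; shapes and names are illustrative): static_rnn unrolls the graph for a fixed number of steps and takes a Python list of per-timestep tensors, while dynamic_rnn builds a single while-loop node and accepts a padded (batch, time, features) tensor with per-example sequence lengths.

```python
# Static vs. dynamic RNN sketch with the TF 1.x-era API
# (illustrative shapes and names; graph construction only).
import tensorflow as tf
tf1 = tf.compat.v1
tf1.disable_eager_execution()

batch, max_steps, feat, units = 32, 20, 50, 128

# Static RNN: the graph is unrolled for exactly `max_steps` steps,
# so the input must be a Python list of per-timestep tensors.
x_static = tf1.placeholder(tf.float32, [batch, max_steps, feat])
inputs_list = tf1.unstack(x_static, axis=1)      # max_steps tensors of (batch, feat)
static_cell = tf1.nn.rnn_cell.BasicLSTMCell(units)
static_out, static_state = tf1.nn.static_rnn(
    static_cell, inputs_list, dtype=tf.float32, scope="static_rnn")

# Dynamic RNN: one while-loop node; takes a (batch, time, feat) tensor
# and per-example sequence lengths, so padding is handled cleanly.
x_dynamic = tf1.placeholder(tf.float32, [None, None, feat])
seq_len = tf1.placeholder(tf.int32, [None])
dynamic_cell = tf1.nn.rnn_cell.BasicLSTMCell(units)
dynamic_out, dynamic_state = tf1.nn.dynamic_rnn(
    dynamic_cell, x_dynamic, sequence_length=seq_len, dtype=tf.float32,
    scope="dynamic_rnn")
```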

Team meeting presentations:

| Date | Topic |
|------|-------|
| 21/03/2019 | How much can Syntax help Sentence Compression? |
| 05/04/2019 | Delete-and-Paraphrase |

Other documents:

About

List of papers and organized presentations of the Deep Learning and Natural Language Understanding Reading Group, Synalp team, Loria laboratory, France
