- tensorflow r1.1
- Neural Machine Translation by Jointly Learning to Align and Translate
- Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation
- A Neural Conversational Model
- A Hierarchical Recurrent Encoder-Decoder for Generative Context-Aware Query Suggestion
- Attention with Intention for a Neural Network Conversation Model
This is the result of training on subtitles from 1,400 movies and TV shows, around 1.4M sentences.
- Anti-language model to suppress generic responses
- MMI loss as the objective function
- Reinforcement Learning?
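The anti-language-model idea from the MMI objective can be sketched in a few lines: rerank candidate responses by log p(T|S) − λ·log p(T), so replies that are generic (high probability under a plain language model) are penalized. The scoring functions below are toy stand-ins for illustration; in the real system they would come from the trained seq2seq model and a separate language model.

```python
# Hypothetical toy log-probabilities for illustration only; a real system
# would query the seq2seq decoder for log p(T|S) and a separately trained
# language model for log p(T).
def seq2seq_logprob(source, target):
    scores = {"i don't know.": -2.0, "i love that movie too.": -2.5}
    return scores.get(target, -10.0)

def lm_logprob(target):
    # A generic reply like "i don't know." is very likely under the LM.
    scores = {"i don't know.": -1.0, "i love that movie too.": -6.0}
    return scores.get(target, -8.0)

def mmi_antilm_score(source, target, lam=0.5):
    # MMI anti-LM objective: log p(T|S) - lambda * log p(T).
    # Subtracting the LM term demotes generic, source-independent replies.
    return seq2seq_logprob(source, target) - lam * lm_logprob(target)

candidates = ["i don't know.", "i love that movie too."]
best = max(candidates,
           key=lambda t: mmi_antilm_score("did you like it?", t))
print(best)  # the specific reply wins once the LM penalty is applied
```

With λ = 0, the generic reply would score highest; the anti-LM term flips the ranking toward the content-bearing response.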