Thang Luong's Thesis on Neural Machine Translation
This repository contains the latest version of my thesis.
Motivated by my advisor Chris Manning's suggestion of extreme openness, I have been sharing my thesis writing since day one. While it is not yet clear how useful this is (smile), you can at least find all of the edits made by my advisor, page by page, here. Thanks Chris!
Code, data, and models described in this thesis can be found at our Stanford NMT Project Page.
To save you reading time, let me highlight the sections I put extra effort into writing, beyond my published papers:
- Chapter 7 - Conclusion: this gives the big picture of what I achieved in my dissertation and how it has influenced subsequent work.
- Chapter 6 - NMT future (especially section 6.3 - Future Outlook): this is where I highlight potential research directions and speculate on the future.
You may also want to read:
- Chapter 2 - Background: I hope this chapter will be useful to readers who, like me, want to implement NMT by hand; it gives details such as the derivations of the LSTM gradients.
- Chapter 1 - Introduction: I hope to enlighten readers with the history of MT from the 17th century until now.
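For readers curious about the LSTM gradient derivations mentioned above, it may help to recall the forward equations they start from. Below is a minimal single-unit sketch of one LSTM step; the variable names and the scalar-weight dictionary are illustrative choices for this sketch, not the thesis's notation.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    # One LSTM step for a single scalar unit, for clarity.
    # w holds illustrative per-gate parameters: input weight (w*),
    # recurrent weight (u*), and bias (b*).
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate cell
    c = f * c_prev + i * g       # new cell state
    h = o * math.tanh(c)         # new hidden state
    return h, c
```

Backpropagating through these equations (e.g. noting that the gradient of the cell state flows through both the `f * c_prev` and `i * g` terms) is exactly the kind of derivation Chapter 2 works out in full.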
Thanks to Gabor Angeli for sharing the thesis setup!