SNLI with word-word attention by LSTM encoder-decoder
Repository contents: `data/`, `model/`, `util/`, `main.lua`

SNLI task with LSTM memory network encoder-decoder and neural attention

This is an implementation of the deep attention fusion LSTM memory network presented in the paper "Long Short-Term Memory Networks for Machine Reading".

Setup and Usage

This code requires Torch7 and nngraph. It has been updated to work with the Torch release from around May 2016. Minimal preprocessing, namely lower-casing and tokenization, is needed to obtain good accuracy.
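As a minimal sketch of the preprocessing described above (lower-casing plus tokenization), something like the following would suffice; the `preprocess` helper and its regex-based tokenizer are illustrative assumptions, not the repo's actual pipeline:

```python
import re

def preprocess(sentence):
    """Hypothetical helper: lower-case a sentence and split it into
    word and punctuation tokens. The repo's real preprocessing may
    use a different tokenizer."""
    # \w+ matches runs of word characters; [^\w\s] splits off each
    # punctuation mark as its own token.
    tokens = re.findall(r"\w+|[^\w\s]", sentence)
    return [t.lower() for t in tokens]

print(preprocess("A man inspects the uniform of a figure."))
```

Each SNLI premise and hypothesis sentence would be passed through such a function before being mapped to word indices for the model.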

```
@inproceedings{cheng2016lstmn,
  author    = {Cheng, Jianpeng and Dong, Li and Lapata, Mirella},
  title     = {Long Short-Term Memory Networks for Machine Reading},
  booktitle = {EMNLP},
  year      = {2016},
  pages     = {551--562}
}
```