SNLI task with an LSTM Memory Network encoder-decoder and neural attention

This is an implementation of the deep attention fusion LSTM memory network presented in the paper "Long Short-Term Memory Networks for Machine Reading".
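
At the core of the model is an intra-attention mechanism over a memory of all previous hidden and cell states, which replaces the standard LSTM recurrence. A rough sketch of the update, following the paper (notation may differ from the code), is:

```latex
\begin{aligned}
a_t^{\,i} &= v^\top \tanh\!\left(W_h h_i + W_x x_t + W_{\tilde h}\, \tilde h_{t-1}\right) \\
s_t^{\,i} &= \operatorname{softmax}_i\!\left(a_t^{\,i}\right) \\
\tilde h_t &= \sum_{i=1}^{t-1} s_t^{\,i}\, h_i , \qquad
\tilde c_t  = \sum_{i=1}^{t-1} s_t^{\,i}\, c_i
\end{aligned}
```

The usual LSTM gates then condition on these attention-weighted summaries instead of on the previous hidden and cell states; in the deep attention fusion variant, the hypothesis encoder additionally attends over the premise's memory tape when updating its cell state.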

Setup and Usage

This code requires Torch7 and nngraph, and has been updated to work with the Torch version from around May 2016. Only minimal preprocessing of the SNLI data is needed to obtain good accuracy: lower-casing and tokenization.
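
For illustration, a minimal preprocessing pass could look like the sketch below (written in Python for convenience; the file names and output format are assumptions, not part of this repository):

```python
# Minimal SNLI preprocessing sketch: lower-case and tokenize each line.
# The file names and the whitespace-separated output format are assumptions,
# not something defined by this repository.
import re

def tokenize(sentence):
    """Lower-case and split punctuation off from words."""
    s = sentence.lower()
    s = re.sub(r"([.,!?;:()])", r" \1 ", s)  # make each punctuation mark its own token
    return s.split()

with open("snli_1.0_train.txt", encoding="utf-8") as src, \
     open("train.tok.txt", "w", encoding="utf-8") as dst:
    for line in src:
        dst.write(" ".join(tokenize(line)) + "\n")
```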

Citation

@article{cheng2016,
  author = {Cheng, Jianpeng and Dong, Li and Lapata, Mirella},
  title = {Long Short-Term Memory Networks for Machine Reading},
  journal = {EMNLP},
  year = {2016},
  pages = {551--562}
}
