
Tree-Structured Long Short-Term Memory Networks

A PyTorch-based implementation of Tree-LSTM from Kai Sheng Tai's paper Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks.

Requirements

  • PyTorch
  • tqdm
  • Java >= 8 (for Stanford CoreNLP utilities)
  • Python >= 2.7

Usage

First run the script ./fetch_and_preprocess.sh, which downloads the SICK dataset, the GloVe word vectors, and the Stanford CoreNLP tools used for preprocessing.

The preprocessing script also generates dependency parses of the SICK dataset using the Stanford Neural Network Dependency Parser.
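For example, from the repository root:

```bash
# Downloads the data and tools, then generates dependency parses of SICK.
# Requires Java >= 8 for the Stanford CoreNLP utilities.
./fetch_and_preprocess.sh
```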

To use the Dependency Tree-LSTM from the paper for predicting similarity of sentence pairs on the SICK dataset, run python main.py to train and test the model; see config.py for the available command-line arguments.

The first run takes a few minutes because the GloVe embeddings for the words in the SICK vocabulary have to be read and stored in a cache; later runs read only the cache.
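As a rough illustration of the caching involved (the cache path and builder function below are hypothetical, not the repository's actual code):

```python
import os
import torch

GLOVE_CACHE = 'data/sick/glove_cache.pth'  # hypothetical cache path

def load_embeddings(build_from_glove):
    """Return GloVe vectors for the SICK vocabulary, caching them after the first run."""
    if os.path.isfile(GLOVE_CACHE):
        # Later runs: only the cached tensor is read.
        return torch.load(GLOVE_CACHE)
    # First run: read the raw GloVe file (slow) and cache the result.
    embeddings = build_from_glove()  # expected to return a (vocab_size, dim) FloatTensor
    torch.save(embeddings, GLOVE_CACHE)
    return embeddings
```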

This code with --lr 0.01 --wd 0.0001 --optim adagrad --batchsize 25 gives a Pearson's coefficient of 0.8336 and an MSE of 0.3119, compared to a Pearson's coefficient of 0.8676 and an MSE of 0.2532 in the original paper. The difference might be due to differences in how the word embeddings are updated.
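For example, the result above was obtained with a command along these lines (flag names exactly as quoted above):

```bash
python main.py --lr 0.01 --wd 0.0001 --optim adagrad --batchsize 25
```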

Notes

PyTorch 0.1.12 has support for sparse tensors in both CPU and GPU modes, which means nn.Embedding can now use sparse updates, potentially reducing memory usage. Enable this with the --sparse argument (a minimal sketch follows the list below), but be warned of two things:

  • I have not tested sparse training myself: the code runs, but its performance has not been benchmarked.
  • Weight decay does not work with sparse gradients/parameters.
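A minimal sketch of what --sparse toggles (sizes and names here are illustrative, not the repository's actual code):

```python
import torch.nn as nn
import torch.optim as optim

vocab_size, emb_dim = 10000, 300  # illustrative sizes

# sparse=True makes the embedding emit sparse gradients, so only the rows
# for words that actually appear in a batch are touched during updates.
emb = nn.Embedding(vocab_size, emb_dim, sparse=True)

# Adagrad accepts sparse gradients; weight_decay must be left at 0 here,
# since weight decay does not work with sparse gradients/parameters.
optimizer = optim.Adagrad(emb.parameters(), lr=0.01)
```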

Acknowledgements

Shout-out to Kai Sheng Tai for the original LuaTorch implementation, and to the PyTorch team for the fun library.

Author

Riddhiman Dasgupta

This is my first PyTorch-based implementation and might contain bugs. Please let me know if you find any!

License

MIT
