Code for "Does Higher Order LSTM Have Better Accuracy for Segmenting and Labeling Sequence Data?"
High-Order-LSTM

This is an implementation of the paper "Does Higher Order LSTM Have Better Accuracy for Segmenting and Labeling Sequence Data?".

Environment and Dependencies

  • Ubuntu 16.04
  • Python 2.7
  • Tensorflow 1.0

Required Files

Feature files

The model uses features extracted from the original texts. If you do not want to use extracted features, ignore the feature input in the model.
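
If you do choose to ignore the feature input, a minimal way to do so in TensorFlow 1.0 is to feed an all-zero array of the appropriate shape into the feature placeholder. The sketch below only illustrates the idea; the placeholder name and the dimensions are hypothetical, not the actual identifiers used in model.py.

```python
# Minimal sketch (Python 2.7, TensorFlow 1.0): feeding zeros in place of
# extracted features. All names (feature_input, batch_size, max_len,
# feature_dim) are hypothetical placeholders, not the identifiers in model.py.
import numpy as np
import tensorflow as tf

batch_size, max_len, feature_dim = 32, 100, 20   # assumed dimensions

# Hypothetical feature placeholder, analogous to the one in the model.
feature_input = tf.placeholder(tf.float32,
                               [batch_size, max_len, feature_dim],
                               name="feature_input")
# ... the rest of the graph would consume feature_input ...
dummy_node = tf.reduce_sum(feature_input)

with tf.Session() as sess:
    # With no extracted features, feed an all-zero array of the same shape.
    zero_features = np.zeros((batch_size, max_len, feature_dim),
                             dtype=np.float32)
    sess.run(dummy_node, feed_dict={feature_input: zero_features})
```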

Probability Files

The multi-order-3 LSTM model uses the probabilities generated by the single order-1 model and the single order-2 model at test time, so these probabilities need to be saved to files. We provide pretrained order-1 and order-2 models in the lstm-1order and lstm-2order directories, respectively. You can use these two models to generate the probability files, or you can obtain the files by training your own single order-n models. A single order-n model is simply a bidirectional LSTM with an order-n tag set.
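
As a rough illustration of what an order-n tag set looks like, the sketch below converts an order-1 tag sequence (e.g. BIO chunk tags) into order-2 tags by pairing each tag with its predecessor, which is the kind of relabeling a single order-2 model would be trained on. The start symbol and separator are illustrative choices, not necessarily the exact encoding used in Datahelpers.py.

```python
# Sketch: deriving an order-2 tag sequence from order-1 tags by pairing each
# tag with the previous one. The "<s>" start symbol and the "|" separator are
# assumptions for illustration only.
def to_order2(tags, start="<s>", sep="|"):
    """Map [t1, t2, ..., tn] to [<s>|t1, t1|t2, ..., t_{n-1}|t_n]."""
    prev = start
    order2 = []
    for tag in tags:
        order2.append(prev + sep + tag)
        prev = tag
    return order2

if __name__ == "__main__":
    order1 = ["B-NP", "I-NP", "O", "B-VP"]
    print(to_order2(order1))
    # ['<s>|B-NP', 'B-NP|I-NP', 'I-NP|O', 'O|B-VP']
```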