Implementations of classical NLP models, following the Rutgers CS533 course.
This repo provides working, reproducible, and easy-to-use implementations of several NLP models, including a basic trigram model, a log-linear model, a feed-forward neural language model, a bigram HMM, an LSTM-based Seq2Seq model with attention, and a BiLSTM with a CRF layer.
The Rutgers CS533 course instructor, Professor Karl Stratos, adapted some of the code from Princeton's COS 484 course, designed by Danqi Chen and Karthik Narasimhan.
- Ngram model: a classical probabilistic language model (trigram sketch below)
- Log-linear language model (sketch below)
- Feed-forward neural language model (sketch below)
- Hidden Markov model: sample code for the bigram case (Viterbi sketch below)
- Seq2Seq with attention: a sequence-to-sequence model with an attention mechanism, plus an implementation for computing the BLEU score (BLEU sketch below)
- BiLSTM with CRF inference layer: a tagger based on a BiLSTM with a CRF inference layer (CRF sketch below)
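To make the listing concrete, here is a minimal sketch of a trigram language model with add-k smoothing, assuming a whitespace-tokenized corpus. The class and method names (`TrigramLM`, `train`, `prob`) are illustrative, not this repo's actual API.

```python
# Minimal trigram LM with add-k smoothing (illustrative, not the repo's API).
from collections import Counter

BOS, EOS = "<s>", "</s>"

class TrigramLM:
    def __init__(self, k=0.01):
        self.k = k                    # add-k smoothing constant
        self.trigrams = Counter()     # counts of (w1, w2, w3)
        self.bigrams = Counter()      # counts of (w1, w2) contexts
        self.vocab = set()

    def train(self, sentences):
        for tokens in sentences:
            padded = [BOS, BOS] + tokens + [EOS]
            self.vocab.update(padded)
            for w1, w2, w3 in zip(padded, padded[1:], padded[2:]):
                self.trigrams[(w1, w2, w3)] += 1
                self.bigrams[(w1, w2)] += 1

    def prob(self, w1, w2, w3):
        # Add-k smoothed conditional probability p(w3 | w1, w2).
        num = self.trigrams[(w1, w2, w3)] + self.k
        den = self.bigrams[(w1, w2)] + self.k * len(self.vocab)
        return num / den

lm = TrigramLM()
lm.train([["the", "cat", "sat"], ["the", "dog", "sat"]])
print(lm.prob("the", "cat", "sat"))
```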
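A log-linear language model defines p(w | h) proportional to exp(theta · phi(h, w)) over a feature map phi. The sketch below uses toy bigram indicator features and a hand-set weight vector; these are assumptions for illustration, and a real model would learn theta by gradient ascent on the log-likelihood.

```python
# Minimal log-linear LM sketch: p(w | h) ∝ exp(θ · φ(h, w)).
import numpy as np

vocab = ["the", "cat", "sat", "</s>"]
V = len(vocab)

def features(history, word):
    # One indicator feature per (previous word, current word) pair.
    phi = np.zeros(V * V)
    phi[vocab.index(history) * V + vocab.index(word)] = 1.0
    return phi

theta = np.zeros(V * V)          # weights (hand-set here; normally learned)
theta[vocab.index("the") * V + vocab.index("cat")] = 2.0

def prob(history, word):
    scores = np.array([theta @ features(history, w) for w in vocab])
    scores -= scores.max()        # numerical stability before softmax
    p = np.exp(scores) / np.exp(scores).sum()
    return p[vocab.index(word)]

print(prob("the", "cat"))        # well above the uniform 0.25
```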
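A feed-forward neural language model (in the style of Bengio et al., 2003) embeds the n-1 context words, concatenates the embeddings, and predicts the next word through a hidden layer. A minimal PyTorch sketch with illustrative sizes, not the repo's actual architecture:

```python
# Minimal feed-forward neural LM: embed context, concat, tanh hidden, softmax out.
import torch
import torch.nn as nn

class FFNNLM(nn.Module):
    def __init__(self, vocab_size, context_size=2, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.hidden = nn.Linear(context_size * embed_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context):            # context: (batch, context_size)
        e = self.embed(context)            # (batch, context_size, embed_dim)
        h = torch.tanh(self.hidden(e.flatten(1)))
        return self.out(h)                 # logits over the vocabulary

model = FFNNLM(vocab_size=100)
context = torch.randint(0, 100, (4, 2))    # batch of 4 two-word contexts
target = torch.randint(0, 100, (4,))       # next-word indices
loss = nn.functional.cross_entropy(model(context), target)
loss.backward()                             # gradients for one SGD step
```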
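For the bigram HMM, the core inference step is Viterbi decoding: dynamic programming over tag sequences in log space. A minimal NumPy sketch with toy parameters (the matrices below are made up for illustration):

```python
# Minimal Viterbi decoder for a bigram HMM, in log space.
import numpy as np

def viterbi(obs, log_pi, log_A, log_B):
    """obs: observation indices; log_pi: (K,) initial log-probs;
    log_A: (K, K) transition log-probs; log_B: (K, V) emission log-probs."""
    K, T = log_A.shape[0], len(obs)
    delta = np.full((T, K), -np.inf)     # best log-score ending in each state
    back = np.zeros((T, K), dtype=int)   # backpointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # (K, K): prev state x next state
        back[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Two hidden states, three observation symbols.
log_pi = np.log([0.6, 0.4])
log_A = np.log([[0.7, 0.3], [0.4, 0.6]])
log_B = np.log([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
print(viterbi([0, 1, 2], log_pi, log_A, log_B))
```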
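BLEU combines clipped n-gram precisions (up to 4-grams) with a brevity penalty. The sketch below is sentence-level and unsmoothed, for illustration only; real evaluation is corpus-level and usually smoothed.

```python
# Minimal sentence-level BLEU: clipped n-gram precisions + brevity penalty.
import math
from collections import Counter

def ngrams(tokens, n):
    return Counter(zip(*(tokens[i:] for i in range(n))))

def bleu(candidate, reference, max_n=4):
    log_precisions = []
    for n in range(1, max_n + 1):
        cand, ref = ngrams(candidate, n), ngrams(reference, n)
        overlap = sum(min(c, ref[g]) for g, c in cand.items())  # clipped counts
        total = max(sum(cand.values()), 1)
        if overlap == 0:
            return 0.0                    # unsmoothed BLEU vanishes here
        log_precisions.append(math.log(overlap / total))
    # Brevity penalty: 1 if candidate is longer than reference, else exp(1 - r/c).
    bp = min(1.0, math.exp(1 - len(reference) / len(candidate)))
    return bp * math.exp(sum(log_precisions) / max_n)

cand = "the quick brown fox jumps over the lazy dog".split()
ref = "a quick brown fox jumps over a lazy dog".split()
print(round(bleu(cand, ref), 4))
```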
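In a BiLSTM-CRF tagger, training needs the log partition function log Z of the linear-chain CRF, computed by the forward algorithm. A minimal log-space NumPy sketch, with random scores standing in for the BiLSTM emissions:

```python
# Minimal CRF forward algorithm: computes log Z for a linear-chain CRF.
import numpy as np

def log_partition(emissions, transitions):
    """emissions: (T, K) per-position tag scores; transitions: (K, K) tag-pair scores."""
    alpha = emissions[0]                          # (K,) log-scores after position 0
    for t in range(1, len(emissions)):
        # alpha[j] = logsumexp_i(alpha[i] + transitions[i, j]) + emissions[t, j]
        scores = alpha[:, None] + transitions     # (K, K)
        m = scores.max(axis=0)
        alpha = m + np.log(np.exp(scores - m).sum(axis=0)) + emissions[t]
    m = alpha.max()
    return m + np.log(np.exp(alpha - m).sum())

rng = np.random.default_rng(0)
T, K = 5, 3                                       # 5 positions, 3 tags
emissions = rng.normal(size=(T, K))               # stand-in for BiLSTM outputs
transitions = rng.normal(size=(K, K))
print(log_partition(emissions, transitions))
```

Training maximizes the gold path's score minus log Z; decoding uses Viterbi over the same scores, as in the HMM sketch above.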