A cache-based natural language model.
Developed N-gram-based and LSTM-based language models for various social media channels
A proof-of-concept audio-interactive personalized chatbot based on Ted Mosby, a character from the renowned TV show "How I Met Your Mother"
N-gram models: unsmoothed, Laplace, and deleted interpolation
Scripts to train n-gram language models on Wikipedia articles
This project is being developed for the SMILe Lab at the Bouvé College of Health Sciences at Northeastern University under Dr. Kristen Allison.
Explored the application of an LSTM-based RNN to analyze protein sequences and evaluate its ability to capture long-range dependencies. Generated new protein sequences and created 3-gram language models based on the trained network.
This repository implements N-gram language modeling with Kneser-Ney and Witten-Bell smoothing techniques, including an in-house tokenizer. It also features an LSTM-based neural model and computes perplexities to compare the n-gram and neural models.
Generates sentences from restaurant call transcripts
An n-gram language model that predicts the next word in a sequence, for autocomplete purposes.
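Several of the projects above combine n-gram counting, Laplace smoothing, and next-word prediction. As a rough illustration of how those pieces fit together (not the code of any repository listed here; the corpus, function names, and `<s>`/`</s>` boundary markers are illustrative assumptions), a minimal bigram model with add-one smoothing might look like:

```python
from collections import Counter, defaultdict

def train_bigram_lm(corpus):
    """Count unigrams and bigrams from a list of tokenized sentences."""
    unigrams = Counter()
    bigrams = defaultdict(Counter)
    for sentence in corpus:
        tokens = ["<s>"] + sentence + ["</s>"]  # sentence-boundary markers
        unigrams.update(tokens)
        for prev, cur in zip(tokens, tokens[1:]):
            bigrams[prev][cur] += 1
    return unigrams, bigrams

def laplace_prob(unigrams, bigrams, prev, word):
    """P(word | prev) with add-one (Laplace) smoothing."""
    vocab_size = len(unigrams)
    return (bigrams[prev][word] + 1) / (unigrams[prev] + vocab_size)

def predict_next(unigrams, bigrams, prev):
    """Return the most probable next word after `prev` (autocomplete)."""
    return max(unigrams, key=lambda w: laplace_prob(unigrams, bigrams, prev, w))

# Toy corpus for demonstration only
corpus = [["the", "cat", "sat"], ["the", "cat", "ran"], ["the", "dog", "sat"]]
uni, bi = train_bigram_lm(corpus)
print(predict_next(uni, bi, "the"))  # "cat" is the most frequent successor of "the"
```

Smoothing matters because an unseen bigram would otherwise get probability zero, making any sentence containing it infinitely perplexing; the repositories above replace add-one with stronger schemes such as Kneser-Ney, Witten-Bell, or deleted interpolation.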