A proof-of-concept audio-interactive personalized chatbot based on Ted Mosby, a character from the renowned TV show "How I Met Your Mother"
Updated Aug 25, 2021 - Python
Developed n-gram-based and LSTM-based language models for various social media channels
Scripts to train n-gram language models on Wikipedia articles
N-gram models: unsmoothed, Laplace, and deleted interpolation
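Several of the listed repos implement smoothed n-gram models. As a minimal sketch of the Laplace (add-one) variant mentioned above, the smoothed bigram probability is P(w | prev) = (c(prev, w) + 1) / (c(prev) + V), where V is the vocabulary size; the function name and toy corpus here are illustrative, not taken from any of the repos:

```python
from collections import Counter

def laplace_bigram_prob(tokens, prev, word):
    """Add-one smoothed P(word | prev) = (c(prev, word) + 1) / (c(prev) + V)."""
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))
    vocab_size = len(unigrams)  # V: number of distinct word types
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab_size)

tokens = "a b a b".split()
# c(a,b) = 2, c(a) = 2, V = 2  ->  (2 + 1) / (2 + 2) = 0.75
print(laplace_bigram_prob(tokens, "a", "b"))
```

Unlike the unsmoothed model, an unseen bigram such as (b, b) gets a small nonzero probability, 1 / (c(b) + V), instead of zero.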
An n-gram language model that predicts the next word in a sequence, for autocomplete
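The next-word prediction behind autocomplete can be sketched with a simple bigram model: count which words follow each context word, then suggest the most frequent continuation. This is an illustrative minimal version, not the repo's actual implementation:

```python
from collections import Counter, defaultdict

def train_bigrams(tokens):
    """Count, for each word, how often each following word occurs."""
    continuations = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        continuations[prev][nxt] += 1
    return continuations

def predict_next(continuations, word):
    """Return the most frequent continuation of `word`, or None if unseen."""
    counts = continuations.get(word)
    if not counts:
        return None
    return counts.most_common(1)[0][0]

corpus = "the cat sat on the mat and the cat slept".split()
model = train_bigrams(corpus)
print(predict_next(model, "the"))  # "cat" follows "the" twice, "mat" once
```

A real autocomplete system would use higher-order n-grams with smoothing and back-off, but the lookup structure is the same.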
Generates sentences from restaurant call transcripts
This project is being developed for the SMILe Lab at the Bouvé College of Health Sciences at Northeastern University, under Dr. Kristen Allison.
A cache-based natural language model.
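A cache language model boosts the probability of recently seen words by interpolating a static model with a recency cache: P(w) = λ·P_cache(w) + (1 − λ)·P_static(w). The class below is a hypothetical unigram sketch of that idea (the class name, cache size, and λ value are assumptions, not taken from the repo):

```python
from collections import Counter, deque

class CacheUnigramModel:
    """Interpolate a static unigram model with a recency cache (sketch)."""

    def __init__(self, corpus_tokens, cache_size=100, lam=0.3):
        self.static = Counter(corpus_tokens)       # background counts
        self.total = sum(self.static.values())
        self.cache = deque(maxlen=cache_size)      # recent history window
        self.lam = lam                             # cache interpolation weight

    def prob(self, word):
        p_static = self.static[word] / self.total
        p_cache = self.cache.count(word) / len(self.cache) if self.cache else 0.0
        return self.lam * p_cache + (1 - self.lam) * p_static

    def observe(self, word):
        """Push a word into the recency cache as text is processed."""
        self.cache.append(word)
```

After observing a word, its probability rises above its static estimate, which captures the burstiness of word usage in running text.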
Explored the application of an LSTM-based RNN to analyze protein sequences and evaluate its ability to capture long-range dependencies. Generated new protein sequences and created 3-gram language models based on the trained network.
This repository implements n-gram language modeling with Kneser-Ney and Witten-Bell smoothing, including an in-house tokenizer. It also features an LSTM neural model and computes perplexities to compare the n-gram and neural models.
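Perplexity, used above to compare the n-gram and neural models, is the exponentiated average negative log-probability the model assigns to each token: PP = exp(−(1/N) Σ log p_i). A minimal sketch (not the repo's code):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp(-mean log p) over per-token model probabilities."""
    n = len(token_probs)
    log_sum = sum(math.log(p) for p in token_probs)
    return math.exp(-log_sum / n)

# A model that assigns every token probability 1/4 has perplexity 4:
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

Lower perplexity means the model is less "surprised" by the test text, so it gives a single number on which the n-gram and LSTM models can be ranked.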