A Bigram Language Model from scratch with no smoothing and add-one smoothing. Outputs bigram counts, bigram probabilities, and the probability of a test sentence.
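A minimal sketch of what such a model computes, assuming a toy tokenized corpus and sentence boundary markers `<s>`/`</s>` (both hypothetical choices, not taken from the repo): count bigrams, then score a sentence either unsmoothed or with add-one (Laplace) smoothing, which adds 1 to each bigram count and the vocabulary size to each denominator.

```python
from collections import defaultdict

def train_bigrams(sentences):
    """Count bigrams and preceding-word unigrams from tokenized sentences."""
    bigram_counts = defaultdict(int)
    unigram_counts = defaultdict(int)
    vocab = set()
    for sent in sentences:
        tokens = ["<s>"] + sent + ["</s>"]
        vocab.update(tokens)
        for w1, w2 in zip(tokens, tokens[1:]):
            bigram_counts[(w1, w2)] += 1
            unigram_counts[w1] += 1
    return bigram_counts, unigram_counts, vocab

def sentence_prob(sent, bigram_counts, unigram_counts, vocab, smoothing="add-one"):
    """P(sentence) as a product of bigram probabilities.
    Add-one smoothing: (count(w1,w2) + 1) / (count(w1) + |V|)."""
    tokens = ["<s>"] + sent + ["</s>"]
    prob = 1.0
    V = len(vocab)
    for w1, w2 in zip(tokens, tokens[1:]):
        if smoothing == "add-one":
            prob *= (bigram_counts[(w1, w2)] + 1) / (unigram_counts[w1] + V)
        else:  # unsmoothed: probability is zero for any unseen bigram
            if unigram_counts[w1] == 0 or bigram_counts[(w1, w2)] == 0:
                return 0.0
            prob *= bigram_counts[(w1, w2)] / unigram_counts[w1]
    return prob

# Hypothetical two-sentence training corpus.
corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
bc, uc, vocab = train_bigrams(corpus)
p = sentence_prob(["the", "cat", "sat"], bc, uc, vocab)
```

With add-one smoothing, even sentences containing unseen bigrams receive a small nonzero probability, at the cost of slightly discounting observed bigrams.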
Updated Jan 12, 2021 - Jupyter Notebook
Language detection using n-gram model for PyCon PL'2020 lightning talk
This repo stores the code and implementation of an N-gram model using both statistical and neural-net approaches.
A cache-based natural language model.
Developed N-gram-based and LSTM-based language models for various social media channels.
A proof-of-concept audio-interactive personalized chatbot based on Ted Mosby, a character from the renowned TV show "How I Met Your Mother"
Autocompletion using an n-gram language model.
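A bigram autocompleter can be sketched in a few lines: count which words follow each word, then suggest the most frequent continuations of the last typed word. The corpus and function names below are illustrative assumptions, not taken from the repo.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus; a real autocompleter would train on far more text.
tokens = "i like to eat i like to code i like pizza".split()

# Map each word to a frequency table of the words that follow it.
next_words = defaultdict(Counter)
for w1, w2 in zip(tokens, tokens[1:]):
    next_words[w1][w2] += 1

def autocomplete(prefix_word, k=3):
    """Suggest up to k most frequent continuations of the last typed word."""
    return [w for w, _ in next_words[prefix_word].most_common(k)]
```

Higher-order n-grams condition on longer prefixes the same way, trading coverage (more unseen contexts) for precision.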
N-gram models: unsmoothed, Laplace, and deleted interpolation.
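Interpolation mixes estimates of different orders so that unseen bigrams fall back on unigram evidence. A minimal sketch, assuming a toy corpus and a fixed weight `lam` (in full deleted interpolation, the weights would instead be tuned on held-out data):

```python
from collections import Counter

# Hypothetical toy corpus for illustration.
tokens = "the cat sat on the mat the dog sat".split()
unigrams = Counter(tokens)
bigrams = Counter(zip(tokens, tokens[1:]))
N = len(tokens)

def interp_prob(w1, w2, lam=0.7):
    """P(w2|w1) ≈ lam * P_ML(w2|w1) + (1 - lam) * P_ML(w2).
    lam is a fixed illustrative weight; deleted interpolation
    would estimate it from held-out counts."""
    p_bi = bigrams[(w1, w2)] / unigrams[w1] if unigrams[w1] else 0.0
    p_uni = unigrams[w2] / N
    return lam * p_bi + (1 - lam) * p_uni
```

Because each component distribution sums to one over the vocabulary, the interpolated distribution does too, unlike ad-hoc backoff without renormalization.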
Create n-gram models for word predictions
Scripts to train n-gram language models on Wikipedia articles
Generating Urdu poetry using spaCy in Python. Poetry is generated using unigrams, bigrams, and trigrams, as well as bidirectional and backward bigram models.
This project is being developed for the SMILe Lab at Bouvé College of Health Sciences at Northeastern University under Dr. Kristen Allison.
COMP 8730 Assignment 2
Explored the application of an LSTM-based RNN to analyze protein sequences and evaluate its ability to capture long-range dependencies. Generated new protein sequences and created 3-gram language models based on the trained network.
Probabilistic Models in NLP
This repository contains my coursework and projects completed during the Natural Language Processing Specialization offered by DeepLearning.AI.