Natural-Language-Processing-Tensorflow

  1. Parts of Speech tagging using Hidden Markov Models
    • A part-of-speech (POS) tagger is a piece of software that reads text in some language and assigns a part of speech, such as noun, verb, or adjective, to each word (and other tokens). A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (i.e. hidden) states. HMMs are especially known for their application in reinforcement learning and temporal pattern recognition such as speech, handwriting, and gesture recognition, part-of-speech tagging, musical score following, partial discharges, and bioinformatics. For tagging, the tags are the hidden states and the words are the observations, so decoding a sentence means finding the most likely tag sequence, e.g. with the Viterbi algorithm (sketched below).
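
A minimal sketch of the decoding step, using a toy two-tag HMM; the actual notebook would estimate these probabilities from a tagged corpus, and the tags, words, and numbers here are purely illustrative:

```python
import numpy as np

# Toy HMM: two hidden tags, illustrative (made-up) probabilities in log space.
states = ["NOUN", "VERB"]
start_p = np.log([0.6, 0.4])                 # P(tag at t=0)
trans_p = np.log([[0.7, 0.3],                # P(tag_t | tag_{t-1})
                  [0.6, 0.4]])
emit_p = {"dogs": np.log([0.8, 0.2]),        # P(word | tag)
          "bark": np.log([0.1, 0.9])}

def viterbi(words):
    """Return the most likely hidden tag sequence for `words`."""
    n, k = len(words), len(states)
    score = np.full((n, k), -np.inf)         # best log-prob ending in each tag
    back = np.zeros((n, k), dtype=int)       # back-pointers for path recovery
    score[0] = start_p + emit_p[words[0]]
    for t in range(1, n):
        for j in range(k):
            cand = score[t - 1] + trans_p[:, j]
            back[t, j] = np.argmax(cand)
            score[t, j] = cand[back[t, j]] + emit_p[words[t]][j]
    # Follow back-pointers from the best final tag.
    path = [int(np.argmax(score[-1]))]
    for t in range(n - 1, 0, -1):
        path.append(back[t, path[-1]])
    return [states[i] for i in reversed(path)]

print(viterbi(["dogs", "bark"]))  # -> ['NOUN', 'VERB']
```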

  2. Topic Modelling using Latent Dirichlet Allocation
    • Topic modeling is a branch of unsupervised natural language processing used to represent a text document through several topics that best explain its underlying information. It can be thought of in terms of clustering, but with a difference: instead of numerical features, we have a collection of words that we want to group together so that each group represents a topic in the document, as in the sketch below.
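
A minimal sketch with gensim (an assumption; the notebook may use a different LDA implementation), fitting two topics to four toy documents:

```python
from gensim import corpora
from gensim.models import LdaModel

# Toy pre-tokenized documents: two about sport, two about politics.
docs = [["cricket", "bat", "score"], ["election", "vote", "party"],
        ["match", "score", "team"], ["party", "minister", "vote"]]

dictionary = corpora.Dictionary(docs)                 # word <-> id mapping
corpus = [dictionary.doc2bow(doc) for doc in docs]    # bag-of-words counts
lda = LdaModel(corpus, num_topics=2, id2word=dictionary, passes=10)

for topic_id, words in lda.print_topics():
    print(topic_id, words)        # each topic is a weighted mixture of words
print(lda[corpus[0]])             # topic distribution of the first document
```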

  3. Machine translation (English to French) using RNN-LSTM
    • We will be creating a sequence-to-sequence (seq2seq) model for machine translation. The ability to communicate with one another is a fundamental part of being human. There are nearly 7,000 different languages worldwide, and as our world becomes increasingly connected, language translation provides a critical cultural and economic bridge between people from different countries and ethnic groups. In the model sketched below, an encoder LSTM compresses the English sentence into a state vector that initializes a decoder LSTM, which emits the French translation word by word.
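
A sketch of the encoder-decoder wiring in tf.keras, assuming integer-encoded sentences and teacher forcing during training; the vocabulary sizes and layer widths are placeholders, not the repo's actual values:

```python
import tensorflow as tf
from tensorflow.keras import layers

src_vocab, tgt_vocab, units = 8000, 10000, 256   # placeholder sizes

# Encoder: read the English sentence, keep only the final LSTM states.
encoder_in = layers.Input(shape=(None,))
x = layers.Embedding(src_vocab, units)(encoder_in)
_, state_h, state_c = layers.LSTM(units, return_state=True)(x)

# Decoder: start from the encoder states, predict the French sentence
# one token at a time (teacher-forced with the shifted target as input).
decoder_in = layers.Input(shape=(None,))
y = layers.Embedding(tgt_vocab, units)(decoder_in)
y = layers.LSTM(units, return_sequences=True)(y, initial_state=[state_h, state_c])
out = layers.Dense(tgt_vocab, activation="softmax")(y)

model = tf.keras.Model([encoder_in, decoder_in], out)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
```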

  4. Generating poetry using RNN-LSTM
    • An RNN-LSTM-based model that generates poetry by repeatedly predicting the next word given the words produced so far; see the sketch below.
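
A minimal next-word language model in tf.keras, the usual setup for this kind of generator; the vocabulary size and layer widths are assumptions:

```python
import tensorflow as tf

vocab_size = 5000   # assumed vocabulary size after tokenizing the poems

# Train on (first n words -> word n+1) pairs sliced from each line of verse.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 100),
    tf.keras.layers.LSTM(150),
    tf.keras.layers.Dense(vocab_size, activation="softmax"),
])
model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")

# Generation: feed a seed phrase, sample a word from the softmax,
# append it to the seed, and repeat until the poem is long enough.
```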

  5. Machine translation (Spanish to English) using Attention Model
    • Here we aim to train a sequence-to-sequence (seq2seq) model for Spanish-to-English translation. The architecture we focus on is an attention-based encoder-decoder: instead of squeezing the whole source sentence into one fixed-length vector, the decoder computes a fresh weighted combination of the encoder outputs at every step.
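
A sketch of Bahdanau-style additive attention in tf.keras; this follows the well-known TensorFlow NMT tutorial pattern, and whether the notebook uses exactly this layer is an assumption:

```python
import tensorflow as tf

class BahdanauAttention(tf.keras.layers.Layer):
    """Additive attention: score(h_t, h_s) = v^T tanh(W1 h_t + W2 h_s)."""
    def __init__(self, units):
        super().__init__()
        self.W1 = tf.keras.layers.Dense(units)   # projects the decoder state
        self.W2 = tf.keras.layers.Dense(units)   # projects the encoder outputs
        self.V = tf.keras.layers.Dense(1)

    def call(self, query, values):
        # query: (batch, units) decoder state; values: (batch, src_len, units)
        query_with_time = tf.expand_dims(query, 1)
        score = self.V(tf.nn.tanh(self.W1(query_with_time) + self.W2(values)))
        weights = tf.nn.softmax(score, axis=1)             # one weight per source word
        context = tf.reduce_sum(weights * values, axis=1)  # weighted sum of encoder outputs
        return context, weights
```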

  6. English to German Translation using Transformers and Google TRAX
    • Here we aim to train a sequence-to-sequence (seq2seq) model for English-to-German translation. The architecture we focus on is the Transformer, implemented with Google's Trax library.
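
A sketch of constructing the model with Trax; the constructor and keyword names below match trax.models.Transformer, but the vocabulary size and hyperparameters are assumptions, not the notebook's settings:

```python
import trax

# Standard base-Transformer shape; a shared subword vocabulary is assumed.
model = trax.models.Transformer(
    input_vocab_size=33300,
    d_model=512, d_ff=2048,
    n_heads=8,
    n_encoder_layers=6, n_decoder_layers=6,
    max_len=2048,
    mode="train",
)
```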

  7. Jigsaw toxic comments classification using BERT
    • Here we built a multi-headed model capable of detecting different types of toxicity, such as threats, obscenity, insults, and identity-based hate, better than Perspective's current models. We used a dataset of comments from Wikipedia's talk page edits.
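
Because each comment can carry several labels at once (e.g. both obscene and insulting), the head is multi-label: six independent sigmoids trained with binary cross-entropy. A sketch using the huggingface transformers library (an assumption; the notebook may load BERT differently, and .pooler_output requires a reasonably recent transformers version):

```python
import tensorflow as tf
from transformers import TFBertModel

bert = TFBertModel.from_pretrained("bert-base-uncased")

input_ids = tf.keras.Input(shape=(128,), dtype=tf.int32, name="input_ids")
attention_mask = tf.keras.Input(shape=(128,), dtype=tf.int32, name="attention_mask")

pooled = bert(input_ids, attention_mask=attention_mask).pooler_output
# Six toxicity labels, each an independent yes/no decision.
probs = tf.keras.layers.Dense(6, activation="sigmoid")(pooled)

model = tf.keras.Model([input_ids, attention_mask], probs)
model.compile(optimizer=tf.keras.optimizers.Adam(2e-5),   # small LR for fine-tuning
              loss="binary_crossentropy")
```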

  8. NLP with disaster tweets using BERT
    • Here we tried to predict the probability that a tweet is about a real disaster or not. We fine-tuned BERT, a state-of-the-art (SOTA) pretrained language model that can be adapted to almost any NLP task.
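
The setup mirrors the Jigsaw model above, except the head is a single sigmoid unit (disaster vs. not). A sketch of the BERT-style preprocessing, again assuming the huggingface transformers tokenizer; the tweet text here is made up:

```python
from transformers import BertTokenizer

tok = BertTokenizer.from_pretrained("bert-base-uncased")

# Hypothetical tweet, padded/truncated to a fixed length for batching.
enc = tok(["Huge wildfire spreading toward the town, evacuations underway"],
          padding="max_length", truncation=True,
          max_length=64, return_tensors="tf")

# enc["input_ids"] and enc["attention_mask"] feed the BERT encoder;
# a Dense(1, activation="sigmoid") head then outputs P(disaster).
```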

  9. Fake News Classifier
    • Here we used a basic bidirectional-LSTM architecture to classify news articles as real or fake; a minimal sketch follows.
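
A minimal sketch of such a classifier in tf.keras; the embedding and LSTM widths are placeholders:

```python
import tensorflow as tf

vocab_size = 10000   # assumed vocabulary size after tokenization

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),  # reads the text in both directions
    tf.keras.layers.Dense(1, activation="sigmoid"),           # P(fake)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
```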
