This project develops text classification models using neural networks: first word embeddings with Word2Vec, then a basic text classifier, and finally an LSTM-based classifier.

Word-Representation-and-Text-Classification-with-Neural-Networks

This project is divided into four parts, A through D. The tasks for each part are broken down below.

Part A: Word Embeddings with Word2Vec

  1. Pre-processing the training corpus
  2. Creating the corpus vocabulary and preparing the dataset
  3. Building a skip-gram neural network architecture
  4. Training the models
  5. Getting the Word Embeddings
  6. Exploring and visualizing your word embeddings using t-SNE
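The first three steps above can be sketched in plain Python. This is a minimal, hypothetical illustration on a toy corpus (the helper names `build_vocab` and `skipgram_pairs` are not from the project): tokenize the text, assign each word an integer id, and emit (target, context) pairs within a fixed window, which are the training examples a skip-gram network learns from.

```python
# Toy corpus; the project trains on a real pre-processed corpus instead.
corpus = ["the quick brown fox jumps over the lazy dog"]

def build_vocab(sentences):
    """Map each unique token to an integer id (step 2)."""
    vocab = {}
    for s in sentences:
        for tok in s.lower().split():  # step 1: minimal pre-processing
            vocab.setdefault(tok, len(vocab))
    return vocab

def skipgram_pairs(sentence, vocab, window=2):
    """Yield (target, context) id pairs for skip-gram training (step 3)."""
    ids = [vocab[t] for t in sentence.lower().split()]
    pairs = []
    for i, target in enumerate(ids):
        lo, hi = max(0, i - window), min(len(ids), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                pairs.append((target, ids[j]))
    return pairs

vocab = build_vocab(corpus)
pairs = skipgram_pairs(corpus[0], vocab)
```

The skip-gram network then takes the target id as input and is trained to predict the context id; the trained input weight matrix becomes the word embeddings extracted in step 5.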

Part B: Basic Text Classification

  1. Developing a neural network classifier using one-hot word vectors, then training and evaluating it
  2. Modifying the model to use a word embedding layer instead of one-hot vectors, learning the embedding values along with the rest of the model
  3. Adapting the model to load and use pre-trained word embeddings instead, then training and evaluating it
  4. Improving performance by adding another fully-connected layer to the network
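The key change in step 2 above is that an embedding layer replaces the one-hot matrix multiplication with a simple row lookup into a learned matrix. Here is a hedged NumPy sketch of the forward pass (shapes, names, and the mean-pooling choice are illustrative assumptions, not the project's actual model):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim, num_classes = 100, 16, 2

E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))   # embedding matrix (learned)
W = rng.normal(scale=0.1, size=(embed_dim, num_classes))  # fully-connected layer
b = np.zeros(num_classes)

def forward(token_ids):
    """Embedding lookup replaces the one-hot matmul; mean-pool, then dense + softmax."""
    pooled = E[token_ids].mean(axis=0)   # (embed_dim,) document representation
    logits = pooled @ W + b              # (num_classes,)
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()

probs = forward(np.array([3, 17, 42]))   # three token ids from one document
```

For step 3, `E` would be initialized from the pre-trained Word2Vec vectors (and optionally frozen) instead of being learned from scratch; for step 4, another weight matrix and nonlinearity would be inserted between the pooled vector and the output layer.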

Part C: Using LSTMs for Text Classification

  1. Readying the inputs for the LSTM
  2. Building the model
  3. Plotting the training and validation accuracy
  4. Evaluating the model on the test data
  5. Extracting the word embeddings
  6. Visualizing the reviews
  7. Visualizing the word embeddings
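Step 1 above ("readying the inputs") usually means padding or truncating variable-length token-id sequences to one fixed length, since an LSTM batch needs a rectangular input. A minimal sketch of that idea, mirroring the default pre-padding/pre-truncating behavior of Keras's `pad_sequences` (this standalone helper is an assumption, not the project's code):

```python
import numpy as np

def pad_sequences(seqs, maxlen, pad_value=0):
    """Left-pad shorter sequences with pad_value; keep the last maxlen ids of longer ones."""
    out = np.full((len(seqs), maxlen), pad_value, dtype=int)
    for i, s in enumerate(seqs):
        trimmed = s[-maxlen:]                      # truncate from the front
        out[i, maxlen - len(trimmed):] = trimmed   # pad on the left
    return out

# One short and one long sequence, both forced to length 4.
batch = pad_sequences([[5, 8], [1, 2, 3, 4, 5, 6]], maxlen=4)
```

The resulting `(batch_size, maxlen)` integer array feeds an embedding layer followed by the LSTM; pre-padding keeps the real tokens nearest the final timestep, whose hidden state the classifier typically reads.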

Part D: A Real Text Classification Task
