
TSAI-DeepNLP-END2.0



Best viewed on the website

NOTE: There might be dependency issues with the torchtext version used in the notebooks; please see this, which may solve the issue



  1. Very Basics

    This describes all the basics of a neural network: how gradient descent works, learning rate, fully connected neurons, the chain rule, etc.
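
    Below is a minimal sketch (not taken from the notebook; the numbers are made up) of one gradient-descent step on a single fully connected neuron, with autograd applying the chain rule:

    ```python
    import torch

    # A single neuron: y_hat = w * x + b, with squared-error loss.
    x, y = torch.tensor(2.0), torch.tensor(7.0)
    w = torch.tensor(1.5, requires_grad=True)
    b = torch.tensor(0.5, requires_grad=True)
    lr = 0.01  # learning rate

    y_hat = w * x + b
    loss = (y_hat - y) ** 2
    loss.backward()  # chain rule: dL/dw = 2*(y_hat - y)*x, dL/db = 2*(y_hat - y)

    with torch.no_grad():  # one gradient-descent step, moving against the gradient
        w -= lr * w.grad
        b -= lr * b.grad

    print(loss.item(), w.item(), b.item())
    ```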

  2. BackProp

    We built a Neural Network in a damn spreadsheet :)

  3. PyTorch 101

    Basics of PyTorch. Here I built a custom MNIST model that classifies an MNIST image and also adds the predicted digit to a random integer.
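
    A hypothetical sketch of such a two-headed model (the layer sizes and exact architecture here are made up, not the notebook's):

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MNISTAdder(nn.Module):
        """Predicts the MNIST digit and the sum of that digit with a random integer (0-9)."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 128), nn.ReLU())
            self.digit_head = nn.Linear(128, 10)     # 10 digit classes
            self.sum_head = nn.Linear(128 + 10, 19)  # possible sums: 0..18

        def forward(self, image, rand_int):
            feat = self.features(image)
            digit_logits = self.digit_head(feat)
            one_hot = F.one_hot(rand_int, num_classes=10).float()
            sum_logits = self.sum_head(torch.cat([feat, one_hot], dim=1))
            return digit_logits, sum_logits

    model = MNISTAdder()
    digit_logits, sum_logits = model(torch.randn(4, 1, 28, 28), torch.randint(0, 10, (4,)))
    print(digit_logits.shape, sum_logits.shape)  # torch.Size([4, 10]) torch.Size([4, 19])
    ```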

  4. RNN & LSTMS

    Built an LSTM from scratch and trained an IMDb sentiment analysis classifier using RNN & LSTM with TorchText.
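
    For reference, an LSTM cell written out by hand looks roughly like this (a generic sketch, not the notebook's exact implementation):

    ```python
    import torch
    import torch.nn as nn

    class LSTMCellScratch(nn.Module):
        """One LSTM cell with explicit input, forget, cell and output gates."""
        def __init__(self, input_size, hidden_size):
            super().__init__()
            self.x2h = nn.Linear(input_size, 4 * hidden_size)
            self.h2h = nn.Linear(hidden_size, 4 * hidden_size)

        def forward(self, x, state):
            h, c = state
            gates = self.x2h(x) + self.h2h(h)
            i, f, g, o = gates.chunk(4, dim=1)
            i, f, o = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o)
            g = torch.tanh(g)
            c_next = f * c + i * g             # new cell state
            h_next = o * torch.tanh(c_next)    # new hidden state
            return h_next, c_next

    cell = LSTMCellScratch(input_size=8, hidden_size=16)
    x = torch.randn(4, 8)
    h0 = c0 = torch.zeros(4, 16)
    h1, c1 = cell(x, (h0, c0))
    print(h1.shape, c1.shape)  # torch.Size([4, 16]) torch.Size([4, 16])
    ```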

  5. LSTM & NLP Augmentation

    Trained an LSTM model on the SST dataset and applied a variety of NLP augmentations.
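
    One classic text augmentation of this kind is a random word swap; a toy sketch (the notebook applies its own mix of augmentations):

    ```python
    import random

    def random_swap(sentence, n_swaps=1, seed=None):
        """Randomly swap pairs of words in a sentence -- a simple text augmentation."""
        rng = random.Random(seed)
        words = sentence.split()
        for _ in range(n_swaps):
            i, j = rng.sample(range(len(words)), 2)
            words[i], words[j] = words[j], words[i]
        return " ".join(words)

    print(random_swap("the movie was surprisingly good", n_swaps=1, seed=0))
    ```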

  6. Encoder Decoder

    A simple encoder-decoder architecture, but for classification! I got to learn how encoder-decoders work, and how the feature vector is used to compress and extract information :)
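
    A rough sketch of that idea, with made-up sizes: the encoder squeezes the whole sequence into one feature vector, and everything downstream sees only that vector:

    ```python
    import torch
    import torch.nn as nn

    class EncoderDecoderClassifier(nn.Module):
        """Encoder compresses the sequence into a single feature vector;
        the decoder unrolls that vector before a classification head."""
        def __init__(self, vocab_size, emb_dim=64, hid_dim=128, num_classes=2):
            super().__init__()
            self.embedding = nn.Embedding(vocab_size, emb_dim)
            self.encoder = nn.LSTM(emb_dim, hid_dim, batch_first=True)
            self.decoder = nn.LSTM(hid_dim, hid_dim, batch_first=True)
            self.classifier = nn.Linear(hid_dim, num_classes)

        def forward(self, tokens):
            embedded = self.embedding(tokens)       # (B, T, emb_dim)
            _, (h, c) = self.encoder(embedded)      # h: (1, B, hid_dim) -- the feature vector
            feature = h.permute(1, 0, 2)            # (B, 1, hid_dim)
            out, _ = self.decoder(feature)          # decode from the compressed vector
            return self.classifier(out[:, -1, :])   # (B, num_classes)

    model = EncoderDecoderClassifier(vocab_size=1000)
    logits = model(torch.randint(0, 1000, (4, 12)))
    print(logits.shape)  # torch.Size([4, 2])
    ```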

  7. Seq2Seq

    Simple Sequence-to-Sequence Model for Question Answering and Similar Question Generation

    Also includes a redo of assignment 5 without any augmentation: 05 Redo

  8. TorchText

    Introduction to the new TorchText APIs, and deprecation of torchtext.legacy

    Here we convert a few notebooks with legacy code to the modern torchtext (0.9.0+) API
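
    A sketch of the modern pipeline (exact APIs shift slightly between torchtext releases, so treat this as indicative):

    ```python
    from torchtext.data.utils import get_tokenizer
    from torchtext.vocab import build_vocab_from_iterator

    tokenizer = get_tokenizer("basic_english")
    texts = ["The movie was great", "The plot was thin"]

    def yield_tokens(data):
        for text in data:
            yield tokenizer(text)

    # Build a vocabulary directly from an iterator of token lists
    vocab = build_vocab_from_iterator(yield_tokens(texts), specials=["<unk>"])
    vocab.set_default_index(vocab["<unk>"])

    # Text pipeline: string -> token ids
    text_pipeline = lambda text: vocab(tokenizer(text))
    print(text_pipeline("the movie was good"))
    ```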

  9. NLP Evaluation Metrics

    Classification Metrics: F1, Precision, Recall, Confusion Matrix

    Machine Translation Metrics: Perplexity, BLEU, BERTScore
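
    A toy illustration of a few of these metrics with scikit-learn and NLTK (labels and sentences are made up):

    ```python
    import math
    from sklearn.metrics import precision_score, recall_score, f1_score, confusion_matrix
    from nltk.translate.bleu_score import sentence_bleu

    # Classification metrics on toy labels
    y_true = [1, 0, 1, 1, 0, 1]
    y_pred = [1, 0, 0, 1, 0, 1]
    print(precision_score(y_true, y_pred), recall_score(y_true, y_pred), f1_score(y_true, y_pred))
    print(confusion_matrix(y_true, y_pred))

    # Bigram BLEU on a toy "translation"
    reference = [["the", "cat", "sat", "on", "the", "mat"]]
    candidate = ["the", "cat", "is", "on", "the", "mat"]
    print(sentence_bleu(reference, candidate, weights=(0.5, 0.5)))

    # Perplexity is exp(average cross-entropy) of the language model
    print(math.exp(2.3))  # cross-entropy of 2.3 nats/token -> perplexity ~ 10
    ```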

  10. Seq2Seq w/ Attention

    Here we modified the Seq2Seq model to use an attention mechanism.

  11. Attention Advanced

    Turns out there are a lot of things to learn about attention mechanisms. There's the Bahdanau et al. paper and the Luong et al. paper. In this assignment we implement a version of the Luong general attention model.
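
    The Luong "general" score is score(h_t, h_s) = h_t^T W_a h_s; a compact sketch of turning that into a context vector (shapes and sizes are illustrative):

    ```python
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class LuongGeneralAttention(nn.Module):
        """Luong 'general' attention: score(h_t, h_s) = h_t^T W_a h_s."""
        def __init__(self, hid_dim):
            super().__init__()
            self.W_a = nn.Linear(hid_dim, hid_dim, bias=False)

        def forward(self, decoder_hidden, encoder_outputs):
            # decoder_hidden: (B, H), encoder_outputs: (B, S, H)
            scores = torch.bmm(encoder_outputs, self.W_a(decoder_hidden).unsqueeze(2)).squeeze(2)  # (B, S)
            weights = F.softmax(scores, dim=1)  # attention distribution over source positions
            context = torch.bmm(weights.unsqueeze(1), encoder_outputs).squeeze(1)  # (B, H)
            return context, weights

    attn = LuongGeneralAttention(hid_dim=16)
    context, weights = attn(torch.randn(4, 16), torch.randn(4, 10, 16))
    print(context.shape, weights.shape)  # torch.Size([4, 16]) torch.Size([4, 10])
    ```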

  12. Attention Is All You Need

    A Vanilla Implementation of the AIAYN Paper (https://arxiv.org/abs/1706.03762)
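
    Its core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V; a minimal sketch:

    ```python
    import math
    import torch
    import torch.nn.functional as F

    def scaled_dot_product_attention(q, k, v, mask=None):
        """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V -- the Transformer's core op."""
        d_k = q.size(-1)
        scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # (B, T_q, T_k)
        if mask is not None:
            scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)
        return weights @ v, weights

    q = torch.randn(2, 5, 64)
    k = v = torch.randn(2, 7, 64)
    out, weights = scaled_dot_product_attention(q, k, v)
    print(out.shape, weights.shape)  # torch.Size([2, 5, 64]) torch.Size([2, 5, 7])
    ```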

  13. AIAYN Recap

    Recap of Attention Is All You Need

  14. BERT BART

    BERT: used for a question answering model and a sentence classification model. BART: used for a paraphrasing model.
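
    For a quick feel of these use cases, Hugging Face pipelines (pulling their default checkpoints, not the notebook's fine-tuned models) cover the first two:

    ```python
    from transformers import pipeline

    # Extractive question answering (downloads a default SQuAD-tuned model)
    qa = pipeline("question-answering")
    print(qa(question="What does BART do here?",
             context="In this assignment BERT answers questions and BART paraphrases sentences."))

    # Sentence classification (default sentiment model)
    clf = pipeline("sentiment-analysis")
    print(clf("This course assignment was a lot of fun."))
    ```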