Python implementation of N-gram Models, Log-linear and Neural Linear Models, Back-propagation and Self-Attention, HMM, PCFG, CRF, EM, VAE

keya-desai/Natural-Language-Processing

Natural-Language-Processing

Coursework for CS533: Natural Language Processing

Assignments:

Assignment 1 - n-gram models
Assignment 2 - Log-linear and Neural Linear Models
Assignment 3 - Back-propagation and Self-Attention
Assignment 4 - Hidden Markov Models (HMM), Probabilistic Context Free Grammar (PCFG), Conditional Random Field (CRF)
Assignment 5 - Expectation Maximisation (EM), Variational Auto-Encoder (VAE)
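As an illustration of the first assignment's topic, here is a minimal bigram language model sketch with add-k smoothing. The corpus, function name, and smoothing constant below are illustrative assumptions, not the assignment's actual code:

```python
from collections import defaultdict

def train_bigram_lm(sentences, k=1.0):
    """Estimate add-k smoothed bigram probabilities P(w2 | w1)
    from a list of tokenized sentences."""
    unigram = defaultdict(int)   # count of w1 as a bigram left context
    bigram = defaultdict(int)    # count of (w1, w2) pairs
    vocab = set()
    for tokens in sentences:
        # Pad each sentence with start/end markers so the model
        # also learns sentence-boundary probabilities.
        padded = ["<s>"] + tokens + ["</s>"]
        vocab.update(padded)
        for w1, w2 in zip(padded, padded[1:]):
            unigram[w1] += 1
            bigram[(w1, w2)] += 1
    V = len(vocab)

    def prob(w2, w1):
        # Add-k smoothed conditional probability: unseen bigrams
        # get a small non-zero mass instead of probability zero.
        return (bigram[(w1, w2)] + k) / (unigram[w1] + k * V)

    return prob

# Toy usage on a two-sentence corpus (illustrative data).
corpus = [["the", "dog", "barks"], ["the", "cat", "meows"]]
p = train_bigram_lm(corpus)
```

With add-k smoothing the conditional distribution for any seen context still sums to one over the vocabulary, while unseen bigrams such as ("dog", "cat") receive small but non-zero probability.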

Projects:

Implementation of the paper: Not all attention is needed - Gated Attention Network for Sequence Data (GA-Net). Repo: https://github.com/keya-desai/Gated-Attention
