DSCI 575: Advanced Machine Learning

Advanced machine learning methods in the context of natural language processing (NLP) applications. Word embeddings, Markov chains, hidden Markov models, topic modeling, recurrent neural networks.

2019/20 Instructor: Varada Kolhatkar

Course Learning Outcomes

By the end of the course, students are expected to be able to

  • Explain and use word embeddings for word meaning representation.
  • Train word embeddings from scratch and use pre-trained word embeddings.
  • Specify a Markov chain and carry out generation and inference with it.
  • Explain the general idea of stationary distribution in Markov chains.
  • Explain hidden Markov models and carry out decoding with them.
  • Explain the Latent Dirichlet Allocation (LDA) approach to topic modeling and carry out topic modeling on text data.
  • Explain Recurrent Neural Networks (RNNs) and use them for classification, generation, and image captioning.
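To make the stationary-distribution outcome above concrete, here is a minimal sketch using a hypothetical two-state weather chain (the transition probabilities are made up for illustration and are not from the course materials). The stationary distribution is the left eigenvector of the transition matrix for eigenvalue 1, and repeated transitions converge to it:

```python
import numpy as np

# Transition matrix of a hypothetical two-state Markov chain
# (states: "sunny", "rainy"); each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The stationary distribution pi satisfies pi @ P = pi,
# i.e. pi is a left eigenvector of P with eigenvalue 1.
eigvals, eigvecs = np.linalg.eig(P.T)
pi = np.real(eigvecs[:, np.isclose(eigvals, 1.0)][:, 0])
pi = pi / pi.sum()  # normalize so the probabilities sum to 1

# Starting from any distribution, repeated application of P
# converges to pi (here after 50 steps).
start = np.array([1.0, 0.0])
approx = start @ np.linalg.matrix_power(P, 50)
```

For this chain the computation gives pi close to (5/6, 1/6): the chain spends about 83% of its time in the first state in the long run, regardless of where it starts.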

All videos are available here.

Tentative schedule

Lecture: topic (lecture notes), followed by resources and optional readings.

1. Word vectors, word embeddings (Notes)
  • Word2Vec papers: Distributed representations of words and phrases and their compositionality; Efficient estimation of word representations in vector space
  • word2vec Explained
  • Debiasing Word Embeddings
2. Using word embeddings, text preprocessing (Notes)
  • Dan Jurafsky's video on tokenization
3. Markov Models (Notes)
  • Markov chains in action
  • Dan Jurafsky's videos on PageRank
4. Hidden Markov models (Notes)
  • Nando de Freitas' lecture on HMMs
  • A gentle intro to HMMs by Eric Fosler-Lussier
5. Topic modeling (Notes)
  • Dave Blei video lecture, paper
6. Introduction to Recurrent Neural Networks (RNNs) (Notes)
  • The Unreasonable Effectiveness of Recurrent Neural Networks
  • Sequence Processing with Recurrent Networks
7. Long short-term memory networks (LSTMs) (Notes)
  • Visual step-by-step explanation of LSTMs
8. Image captioning using CNNs and RNNs; wrap-up (Notes)
  • Jeff Heaton's video
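As a toy illustration of the word-embedding similarity ideas from lectures 1 and 2, the sketch below computes cosine similarity between hand-made vectors. The vectors and words here are invented for illustration only; real word2vec embeddings are learned from corpora and are typically 100 to 300 dimensional:

```python
import numpy as np

# Made-up 4-dimensional "embeddings" (illustration only,
# not from any pretrained model).
vectors = {
    "king":  np.array([0.9, 0.8, 0.1, 0.2]),
    "queen": np.array([0.9, 0.1, 0.8, 0.2]),
    "apple": np.array([0.1, 0.1, 0.1, 0.9]),
}

def cosine(u, v):
    # Cosine similarity: the standard measure of word-vector similarity.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_king_queen = cosine(vectors["king"], vectors["queen"])
sim_king_apple = cosine(vectors["king"], vectors["apple"])
```

With these toy vectors, "king" is closer to "queen" than to "apple", which is the behavior well-trained embeddings exhibit for semantically related words.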

Resources

Books