Advanced Topics in Statistical Machine Learning, Oxford Statistics, Hilary Term 2020

SC4/SM8 Advanced Topics in Statistical Machine Learning

Announcements:

Links:

Course Information

Aims and objectives:

Machine learning is widely used across the sciences, engineering and society to construct methods that identify interesting patterns and make accurate predictions from large datasets. This course introduces several widely used machine learning frameworks and describes their underpinning statistical principles and properties. It covers both unsupervised and supervised learning, treating several advanced and state-of-the-art topics in detail, and also considers the computational aspects of machine learning algorithms and how they scale to large datasets.

Prerequisites:

A8 Probability and A9 Statistics.
Some material from this year's SB2.2 Statistical Machine Learning syllabus (PCA and the basics of clustering, mainly taught in the first three lectures of SB2.2, also taught in HT2019) will be used. However, SB2.2 is not a prerequisite, and background notes will be provided.

Synopsis:

  • Review of unsupervised and supervised learning.
  • Duality in convex optimization and support vector machines.
  • Kernel methods and reproducing kernel Hilbert spaces. Representer theorem. Representation of probabilities in RKHS. Kernel PCA.
  • Deep learning. Representation learning. Neural networks and computation graphs. Automatic differentiation. Stochastic gradient descent.
  • Probabilistic machine learning: latent variable models, EM algorithm, mixtures, mixtures of experts, probabilistic PCA.
  • Variational inference, deep generative models, variational auto-encoders.
  • Bayesian learning: Laplace Approximation. Variational Bayes, Latent Dirichlet Allocation.
  • Collaborative filtering models, probabilistic matrix factorization.
  • Gaussian processes for regression and classification. Bayesian optimization.
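
As a small taste of the kernel PCA topic above, here is a minimal numpy sketch (the toy two-rings dataset and the bandwidth choice are illustrative assumptions, not taken from the course materials):

```python
import numpy as np

# Hypothetical toy data: two concentric rings in 2-D, which linear PCA
# cannot separate but kernel PCA with an RBF kernel can.
rng = np.random.default_rng(0)
n = 100
theta = rng.uniform(0.0, 2.0 * np.pi, n)
radius = np.where(np.arange(n) < n // 2, 1.0, 3.0)
X = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])

def rbf_gram(X, gamma=0.5):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def kernel_pca(X, n_components=2, gamma=0.5):
    K = rbf_gram(X, gamma)
    m = len(K)
    one = np.full((m, m), 1.0 / m)
    # centre the implicit feature map in the RKHS
    Kc = K - one @ K - K @ one + one @ K @ one
    vals, vecs = np.linalg.eigh(Kc)          # eigenvalues in ascending order
    vals, vecs = vals[::-1], vecs[:, ::-1]   # reorder: largest first
    # projection of training point i onto component k is sqrt(val_k) * vec[i, k]
    return vecs[:, :n_components] * np.sqrt(vals[:n_components])

Z = kernel_pca(X)
```

Note that everything is expressed through the Gram matrix: the feature map is never computed explicitly, which is the point of the kernel trick covered in the course.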

Textbooks and Background Reading

Recommended textbooks:

  • Bishop, Pattern Recognition and Machine Learning, Springer.
  • Murphy, Machine Learning: A Probabilistic Perspective, MIT Press.
  • Hastie, Tibshirani and Friedman, The Elements of Statistical Learning, Springer. ebook
  • Shalev-Shwartz and Ben-David, Understanding Machine Learning: From Theory to Algorithms, Cambridge University Press.
  • Ian Goodfellow, Yoshua Bengio and Aaron Courville, Deep Learning, MIT Press. website

Background Review Aids:

Software

R

Python

Jupyter notebooks

Knowledge of Python is not required for this course, but some descriptive examples in lectures may be done in Python. Students interested in further Python training are referred to the free University IT online courses.

Special Guest Lectures:

There will be a series of special guest lectures on (even more) advanced topics in machine learning. These will be 1.5-2 hours in length, with the first half being a more pedagogical introduction to an area and the second half a research seminar. These lectures are not examinable.

  • Some Thursdays in LG.01
  • Feb 13 1300-1500 week 4: Nicolas Heess and Leonard Hasenclever (DeepMind) on reinforcement learning and control
  • Feb 27 1330-1500 week 6: Arthur Gretton (Gatsby Unit, UCL)
  • Mar 5 week 7: Razvan Pascanu (DeepMind) on looking at data efficiency from the learning algorithm perspective
  • Mar 6 Friday 1530-1630 week 8: Max Welling (Amsterdam, Qualcomm) (departmental distinguished seminar)
  • Mar 12 1300-1500 week 8: Silvia Chiappa (DeepMind)

Course Materials

The course materials will appear here before the course starts. They consist of notes, slides, and Jupyter notebooks. Notes may not be exhaustive and should be used in conjunction with the slides. All materials may be updated during the course and are thus best read on screen. Please email me any typos or corrections.

Notes:

Slides:

  • slides01: Admin, PCA, K-means, empirical risk minimisation
  • slides02: Convex duality, SVMs, kernels
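
To illustrate the K-means material in slides01, the standard Lloyd iterations can be sketched in a few lines of numpy (a hypothetical minimal implementation with invented toy data, not the course's notebook code):

```python
import numpy as np

def kmeans(X, k, n_iter=50, seed=0):
    """Plain Lloyd's algorithm: alternate nearest-centre assignment
    and centre re-estimation until the centres stop moving."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(n_iter):
        # assign each point to its nearest centre (squared Euclidean distance)
        d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # recompute each centre as the mean of its assigned points
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                        else centres[j] for j in range(k)])
        if np.allclose(new, centres):
            break
        centres = new
    return centres, labels

# toy data: two well-separated Gaussian blobs
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(5.0, 0.3, (50, 2))])
centres, labels = kmeans(X, k=2)
```

Each iteration can only decrease the within-cluster sum of squares, so the algorithm terminates, though (as the lectures discuss) only at a local optimum that depends on the initialisation.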

Problem Sheets:

  • sheet1 due Thursday noon week 3

Other Resources:

Information:

Information for Part C and OMMS Students

Information for MSc Statistical Science Students

  • Classes are Fridays 1500-1600, weeks 3, 5, 8 and TT1, in LG.01

Information for DPhil and CDT students

  • Assessment will be via reproducibility challenge projects.
  • Aim is to reproduce recent ML conference papers.
  • Papers assigned in weeks 7,8.
  • Four-page reports, open-sourced software repositories, and a 20-minute presentation are due in early TT.