Bayesian Statistics I


Aims

The course aims to give a solid introduction to the Bayesian approach to statistical inference, with special emphasis on models in modern statistics and machine learning. After an introduction to the subjective probability concept that underlies Bayesian inference, the course moves on to the mathematics of prior-to-posterior updating in basic statistical models, such as the Bernoulli, normal and multinomial models. Linear regression and nonlinear regression are also analyzed using a Bayesian approach. The course subsequently shows how complex models can be analyzed with simulation methods like Markov Chain Monte Carlo (MCMC) or approximate methods like Variational Inference. Bayesian prediction and marginalization of nuisance parameters are explained, and introductions to Bayesian model selection and Bayesian decision theory are also given.
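
As a first taste of the prior-to-posterior updating described above, the R sketch below works through the conjugate Beta-Bernoulli case; the data and the Beta(2, 2) prior are invented purely for illustration and are not part of the course material.

```r
# Conjugate Beta-Bernoulli updating: a minimal illustration with invented data.
# Prior: theta ~ Beta(a0, b0). Data: s successes in n Bernoulli trials.
# Posterior: theta | data ~ Beta(a0 + s, b0 + n - s).
a0 <- 2; b0 <- 2              # prior hyperparameters (chosen for illustration)
n  <- 20; s <- 14             # hypothetical data: 14 successes in 20 trials
a1 <- a0 + s; b1 <- b0 + n - s

thetaGrid <- seq(0, 1, length.out = 1000)
plot(thetaGrid, dbeta(thetaGrid, a1, b1), type = "l",
     xlab = expression(theta), ylab = "density", main = "Beta-Bernoulli updating")
lines(thetaGrid, dbeta(thetaGrid, a0, b0), lty = 2)    # prior (dashed)
legend("topleft", legend = c("posterior", "prior"), lty = c(1, 2))
```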


Course literature and Schedule

The course will use the following course literature, with the book below as the main reference:

  • Gelman, Carlin, Stern, Dunson, Vehtari, Rubin (2014). Bayesian Data Analysis. Chapman & Hall/CRC: Boca Raton, Florida. 3rd edition.
  • My slides on this page
  • Additional course material linked from this page, such as articles and tutorials.

The course schedule on TimeEdit is here: Schedule.


Computer labs

  • The computer labs are a central part of this course and you should expect to allocate substantial time to each lab. Many of the exam questions will be computer-based, so working on the labs will also help you prepare for the exam.
  • You are strongly encouraged to do the labs in R, but any programming language is ok to use.
  • The labs should be done in pairs of students.
  • Each lab report should be submitted as a PDF along with the .R file with code. Submission is done through the Mondo system.
  • There are only two hours of supervised time allocated to each lab, which will not be enough to complete it. The idea is that you start working on the lab before the computer session, so that you are in a position to ask questions at the session.

Teachers


Mattias Villani, Lecturer
Professor
Department of Statistics, Stockholm University
Division of Statistics and Machine Learning, Linköping University


Oscar Oelrich, Assistant
PhD Candidate
Department of Statistics, Stockholm University


Munezero Parfait, Assistant
PhD Candidate
Department of Statistics, Stockholm University


Part 1 - The Basics

Lecture 1 - Basic concepts. Likelihood. The Bernoulli model.
Reading: BDA Ch. 1, 2.1-2.4 | Slides
Code: Beta density | Bernoulli model

Lecture 2 - Gaussian model. The Poisson model. Conjugate priors. Prior elicitation.
Reading: BDA Ch. 2.5-2.9 | Slides
Code: One-parameter Gaussian model

Lecture 3 - Multi-parameter models. Marginalization. Multinomial model. Multivariate normal model.
Reading: BDA Ch. 3 | Slides
Code: Two-parameter Gaussian model | Multinomial model
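
For a concrete feel of the multi-parameter material, here is a minimal R sketch of simulating from the Dirichlet posterior of multinomial probabilities by normalizing independent Gamma draws; the category counts and the uniform prior are invented for illustration and are not the linked course code.

```r
# Dirichlet posterior for multinomial probabilities (invented counts).
# Prior: theta ~ Dirichlet(alpha). Data: counts y. Posterior: Dirichlet(alpha + y).
y     <- c(38, 27, 25, 10)          # hypothetical category counts
alpha <- rep(1, length(y))          # uniform Dirichlet prior

rDirichlet <- function(nDraws, param) {
  # Draw from Dirichlet(param) by normalizing independent Gamma(param_k, 1) draws
  draws <- matrix(rgamma(nDraws * length(param), shape = param),
                  nrow = nDraws, byrow = TRUE)
  draws / rowSums(draws)
}

postDraws <- rDirichlet(5000, alpha + y)
colMeans(postDraws)                      # posterior means of the category probabilities
mean(postDraws[, 1] > postDraws[, 2])    # e.g. Pr(theta_1 > theta_2 | data)
```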

Exercise session 1
Exercises: Exercise 1
Assistant: Munezero Parfait

Computer Lab 1 - Exploring posterior distributions in one-parameter models by simulation and direct numerical evaluation.
Lab: Lab 1
Assistant: Munezero Parfait
Submission tool: Mondo.


Part 2 - Bayesian Regression and Classification

Lecture 4 - Prediction. Making Decisions.
Reading: BDA Ch. 9.1-9.2 | Slides
Code: Prediction with two-parameter Gaussian model
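
The following minimal R sketch illustrates Bayesian prediction in the two-parameter Gaussian model under the standard noninformative prior: first simulate the parameters from their posterior, then simulate a new observation given each parameter draw. The data are invented and the sketch is not the linked course code.

```r
# Posterior predictive simulation in the two-parameter Gaussian model
# under the noninformative prior p(mu, sigma^2) proportional to 1/sigma^2.
set.seed(1)
y <- c(14, 25, 45, 25, 30, 33, 19, 50, 34, 67)     # hypothetical observations
n <- length(y); ybar <- mean(y); s2 <- var(y)

nDraws <- 5000
sigma2 <- (n - 1) * s2 / rchisq(nDraws, df = n - 1)           # scaled inverse-chi^2 draws of sigma^2 | y
mu     <- rnorm(nDraws, mean = ybar, sd = sqrt(sigma2 / n))   # mu | sigma^2, y
yPred  <- rnorm(nDraws, mean = mu, sd = sqrt(sigma2))         # posterior predictive draws

quantile(yPred, c(0.025, 0.975))    # 95% predictive interval for a new observation
```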

Lecture 5 - Linear Regression. Nonlinear regression. Regularization priors.
Reading: BDA Ch. 14 and Ch. 20.1-20.2 | Slides
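
To give a concrete feel for Bayesian linear regression ahead of the lecture, here is a minimal R sketch of joint posterior simulation under the noninformative prior p(beta, sigma^2) proportional to 1/sigma^2 (BDA Ch. 14); the data are simulated and the quadratic basis is only one example of a polynomial regression.

```r
# Bayesian linear regression under the noninformative prior (BDA Ch. 14).
# Simulated data, purely for illustration. Uses MASS, which ships with R.
set.seed(2)
n <- 100
x <- runif(n)
X <- cbind(1, x, x^2)                                   # quadratic polynomial basis
y <- X %*% c(1, 2, -1) + rnorm(n, sd = 0.3)

k       <- ncol(X)
XtXinv  <- solve(crossprod(X))
betaHat <- drop(XtXinv %*% crossprod(X, y))             # OLS estimate
s2      <- sum((y - X %*% betaHat)^2) / (n - k)

nDraws <- 5000
sigma2 <- (n - k) * s2 / rchisq(nDraws, df = n - k)     # draws of sigma^2 | y
betas  <- t(sapply(sigma2, function(s2draw)             # draws of beta | sigma^2, y
  MASS::mvrnorm(1, mu = betaHat, Sigma = s2draw * XtXinv)))
colMeans(betas)                                         # posterior means of beta
```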

Lecture 6 - Classification. Posterior approximation. Logistic regression.
Reading: BDA Ch. 16.1-16.3 | Slides
Code: Logistic and Probit Regression
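
One common way to approximate a logistic regression posterior is a normal (Laplace) approximation centered at the posterior mode, found with optim. The sketch below uses simulated data and an assumed N(0, 10^2) prior on each coefficient; it illustrates the idea and is not the course's own code.

```r
# Normal (Laplace) approximation of a logistic regression posterior via optim().
# Simulated data and an assumed N(0, tau^2) prior, for illustration only.
set.seed(3)
n <- 500
X <- cbind(1, rnorm(n), rnorm(n))
betaTrue <- c(-0.5, 1, -2)
y <- rbinom(n, 1, plogis(X %*% betaTrue))

logPost <- function(beta, y, X, tau = 10) {
  linPred  <- X %*% beta
  logLik   <- sum(y * linPred - log1p(exp(linPred)))    # Bernoulli log-likelihood
  logPrior <- sum(dnorm(beta, 0, tau, log = TRUE))      # independent N(0, tau^2) priors
  logLik + logPrior
}

fit <- optim(rep(0, ncol(X)), logPost, gr = NULL, y = y, X = X,
             method = "BFGS", control = list(fnscale = -1), hessian = TRUE)
postMode <- fit$par                    # posterior mode
postCov  <- -solve(fit$hessian)        # approximate posterior covariance
postMode; sqrt(diag(postCov))          # approximate posterior means and standard deviations
```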

Exercise session 2
Exercises: Exercise 2
Assistant: Oscar Oelrich

Computer Lab 2 - Polynomial regression and classification with logistic regression.
Lab: Lab 2 | Temp in Linköping dataset | WomenWork dataset
Assistant: Oscar Oelrich
Submission tool: Mondo.


Part 3 - More Advanced Models, MCMC and Variational Bayes

Lecture 7 - Monte Carlo simulation. Gibbs sampling. Data augmentation.
Reading: BDA Ch. 10-11 | Slides
Code: Gibbs sampling for a bivariate normal | Gibbs sampling for a mixture of normals
Extra material: Illustration of Gibbs sampling when parameters are strongly correlated.
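
As a warm-up for the lecture, here is a minimal R sketch of Gibbs sampling for a bivariate normal with known correlation, a standard textbook example; the value of rho is chosen only to make the dependence between the components visible.

```r
# Gibbs sampling for a bivariate normal with zero means, unit variances and correlation rho.
# Full conditionals: theta1 | theta2 ~ N(rho * theta2, 1 - rho^2), and symmetrically for theta2.
set.seed(4)
rho    <- 0.9
nDraws <- 5000
theta  <- matrix(0, nDraws, 2)       # starting value (0, 0)
for (i in 2:nDraws) {
  theta[i, 1] <- rnorm(1, mean = rho * theta[i - 1, 2], sd = sqrt(1 - rho^2))
  theta[i, 2] <- rnorm(1, mean = rho * theta[i, 1],     sd = sqrt(1 - rho^2))
}
cor(theta)       # off-diagonal should be close to rho
plot(theta, pch = ".", xlab = expression(theta[1]), ylab = expression(theta[2]))
```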

Lecture 8 - MCMC and Metropolis-Hastings
Reading: BDA Ch. 11 | Slides
Code: Simulating Markov Chains
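
A minimal R sketch of the random walk Metropolis algorithm; the Gamma(2, 1) target and the proposal standard deviation are arbitrary choices made only to keep the example self-contained.

```r
# Random walk Metropolis for a univariate target given by its log density.
# The Gamma(2, 1) target is chosen arbitrarily to keep the example self-contained.
set.seed(5)
logTarget <- function(theta) {
  if (theta > 0) dgamma(theta, shape = 2, rate = 1, log = TRUE) else -Inf
}

nDraws   <- 10000
stepSd   <- 1.0                    # tuning parameter: proposal standard deviation
draws    <- numeric(nDraws)
draws[1] <- 1                      # starting value
for (i in 2:nDraws) {
  proposal <- rnorm(1, mean = draws[i - 1], sd = stepSd)
  logRatio <- logTarget(proposal) - logTarget(draws[i - 1])
  draws[i] <- if (log(runif(1)) < logRatio) proposal else draws[i - 1]
}
mean(draws)    # should be close to the Gamma(2, 1) mean of 2
```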

Lecture 9 - HMC, Variational Bayes and Stan.
Reading: BDA Ch. 12.4 and Ch. 13.7 | RStan vignette | Slides
Code: RStan - Three Plants | RStan - Bernoulli model | RStan - Logistic regression | RStan - Logistic regression with random effects | RStan - Poisson model
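
A minimal rstan sketch of the Bernoulli model, written in its equivalent binomial form; it assumes the rstan package is installed, and the data and the Beta(2, 2) prior are invented. It is only meant to show the basic workflow of compiling and fitting a Stan model from R, not to reproduce the linked course code.

```r
# Fitting a Bernoulli/binomial model with rstan (assumes rstan is installed).
library(rstan)

stanModel <- "
data {
  int<lower=0> N;               // number of Bernoulli trials
  int<lower=0, upper=N> s;      // number of successes
}
parameters {
  real<lower=0, upper=1> theta;
}
model {
  theta ~ beta(2, 2);           // prior (hyperparameters chosen for illustration)
  s ~ binomial(N, theta);       // equivalent to N Bernoulli(theta) observations
}
"

y   <- c(1, 1, 0, 1, 0, 1, 1, 1, 0, 1)      # hypothetical binary data
fit <- stan(model_code = stanModel,
            data = list(N = length(y), s = sum(y)),
            iter = 2000, chains = 4)
print(fit)     # posterior summary for theta
```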

Computer Lab 3 - Gibbs sampling for the normal model, mixture of normals. Metropolis-Hastings for Poisson regression.
Lab: Lab 3 | Rainfall dataset | eBay dataset | How to code up Random Walk Metropolis
Assistant: Munezero Parfait
Submission tool: Mondo.


Part 4 - Model Inference and Variable Selection

Lecture 10 - Bayesian model comparison.
Reading: BDA Ch. 7 | Slides
Code: Comparing models for count data
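
To illustrate the idea of comparing models via marginal likelihoods, here is a minimal R sketch that computes the log marginal likelihoods of two conjugate models for iid count data and the resulting Bayes factor; the data, the Poisson-Gamma model and the geometric-Beta alternative are chosen for illustration and need not match the linked course code.

```r
# Log marginal likelihoods and Bayes factor for two conjugate models of count data.
# Data and hyperparameters are invented for illustration.
y <- c(0, 2, 1, 3, 2, 4, 1, 0, 2, 3)     # hypothetical counts

# Model 1: y_i ~ Poisson(lambda), lambda ~ Gamma(a, b)
logMargPoisson <- function(y, a = 1, b = 1) {
  n <- length(y); s <- sum(y)
  a * log(b) - lgamma(a) - sum(lgamma(y + 1)) +
    lgamma(a + s) - (a + s) * log(b + n)
}

# Model 2: y_i ~ Geometric(theta) with P(y) = theta * (1 - theta)^y, theta ~ Beta(alpha, beta)
logMargGeometric <- function(y, alpha = 1, beta = 1) {
  n <- length(y); s <- sum(y)
  lbeta(alpha + n, beta + s) - lbeta(alpha, beta)
}

logBF <- logMargPoisson(y) - logMargGeometric(y)
exp(logBF)      # Bayes factor in favour of the Poisson model
```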

Lecture 11 - Computing the marginal likelihood, Bayesian variable selection, model averaging.
Reading: Article on Bayesian variable selection | Slides

Lecture 12 - Model evaluation and course summary.
Reading: BDA Ch. 6.1-6.4 | Slides

Computer Lab 4 - HMC for time series in RStan.
Lab: Lab 4 | campylobacter dataset
Assistant: Oscar Oelrich
Submission tool: Mondo.


Examination

The course examination consists of:

  • Written lab reports (deadlines will be given in Mondo)
  • Take home exam:
    • 1st attempt: handed out on January 14. To be returned on January 18, at 6 PM. The exam will be available in Mondo here, where you also submit your solutions.
    • 2nd attempt: handed out on TBD. To be returned on TBD.
