Introduction to Uncertainty Quantification
This version of the course is being taught at Purdue University during Spring 2018. The code for the course is ME 59700 and MA 598. The instructor is Prof. Ilias Bilionis. The class meets every Tuesday and Thursday 12:00pm-1:15pm at GRIS 102.
The goal of this course is to introduce the fundamentals of uncertainty quantification to advanced undergraduates or graduate engineering and science students with research interests in the field of predictive modeling. Upon completion of this course the students should be able to:
- Represent mathematically the uncertainty in the parameters of physical models.
- Propagate parametric uncertainty through physical models to quantify the induced uncertainty on quantities of interest.
- Calibrate the uncertain parameters of physical models using experimental data.
- Combine multiple sources of information to enhance the predictive capabilities of models.
- Pose and solve design optimization problems under uncertainty involving expensive computer simulations.
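As a taste of the second objective above, here is a minimal Monte Carlo sketch of forward uncertainty propagation. The model and the input distribution are toy choices made up purely for illustration, not material from the course:

```python
# Minimal sketch of forward uncertainty propagation by Monte Carlo.
# The "physical model" and the input distribution are toy choices.
import numpy as np

def model(x):
    # Toy quantity of interest: a nonlinear map of the uncertain input.
    return np.exp(-x) * np.sin(x)

rng = np.random.RandomState(0)
# Uncertain parameter: x ~ N(1, 0.1^2) (an assumed toy distribution).
x_samples = rng.normal(loc=1.0, scale=0.1, size=10000)
y_samples = model(x_samples)

# Summarize the induced uncertainty on the quantity of interest.
print("mean of QoI: %.4f" % y_samples.mean())
print("std  of QoI: %.4f" % y_samples.std())
```

The same pattern (sample inputs, push them through the model, summarize the outputs) underlies the sampling methods covered in Lectures 13-14.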
Grading
- 10% Participation
- 60% Homework
- 30% Final Project
Lecture 1 - Introduction, 01/09/2018.
Lecture 2 - Quantifying Uncertainties in Physical Models, 01/11/2018.
Lecture 3 - Introduction to Probability Theory (Part I), 01/16/2018.
Lecture 4 - Introduction to Probability Theory (Part II), 01/18/2018.
Lecture 5 - Common Random Variables, 01/23/2018.
Lecture 6 - Turning Prior Information to Probability Statements, 01/25/2018.
Lecture 7 - Generalized Linear Models (Part I), 01/30/2018.
Lecture 8 - Generalized Linear Models (Part II), 02/01/2018.
Lecture 9 - Generalized Linear Models (Part III), 02/06/2018.
Lecture 10 - Priors on Function Spaces, 02/08/2018.
Lecture 11 - Conditioning a Random Field on Observations, 02/13/2018.
Lecture 12 - Reducing the Dimensionality of Random Fields, 02/15/2018.
Lecture 13 - Uncertainty Propagation: Sampling Methods I, 02/20/2018.
Lecture 14 - Uncertainty Propagation: Sampling Methods II, 02/22/2018.
Lecture 15 - Uncertainty Propagation: Perturbation Methods, 02/27/2018.
Lecture 16 - Uncertainty Propagation: Polynomial Chaos I, 03/01/2018.
Lecture 17 - Uncertainty Propagation: Polynomial Chaos II, 03/06/2018.
Lecture 18 - Uncertainty Propagation: Polynomial Chaos III, 03/08/2018.
No lecture on Tuesday 03/13/2018 (spring break).
No lecture on Thursday 03/15/2018 (spring break).
Lecture 19 - Inverse Problems/Model Calibration: Classic Approaches, 03/20/2018.
No lecture on Thursday 03/22/2018 (The instructor will be at 2018 NSF Design Circle Workshop: Designing and Developing Global Engineering Systems).
Lecture 20 - Inverse Problems/Model Calibration: Bayesian Approaches, 03/27/2018.
Lecture 21 - Markov Chain Monte Carlo I, 03/29/2018.
Lecture 22 - Markov Chain Monte Carlo II, 04/03/2018.
Lecture 23 - Markov Chain Monte Carlo III, 04/05/2018.
- Topics: Hierarchical Bayes examples; Logistic regression; PyMC tutorial.
- Slides: No slides. This is a hands-on section.
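The hands-on MCMC sessions use PyMC. As a language-agnostic warm-up, here is a minimal random-walk Metropolis sampler targeting a standard normal; this is an illustrative sketch of the algorithm, not PyMC code:

```python
# Random-walk Metropolis sampler for a standard normal target.
# Purely illustrative; the course's hands-on sessions use PyMC instead.
import numpy as np

def log_target(x):
    # Unnormalized log-density of N(0, 1).
    return -0.5 * x ** 2

def metropolis(n_samples, step=1.0, seed=0):
    rng = np.random.RandomState(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.randn()
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.rand()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

s = metropolis(20000)
print("sample mean: %.3f, sample std: %.3f" % (s.mean(), s.std()))
```

The empirical mean and standard deviation should be close to 0 and 1, the moments of the target distribution.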
Lecture 24 - Bayesian Model Selection I, 04/10/2018.
Lecture 25 - Bayesian Model Selection II, 04/12/2018.
- Topics: PySMC tutorial.
- Slides: No slides. This is a hands-on section.
No lecture on Tuesday 04/17/2018 (The instructor will be at the SIAM Conference on Uncertainty Quantification 2018).
No lecture on Thursday 04/19/2018 (The instructor will be at the SIAM Conference on Uncertainty Quantification 2018).
Lecture 26 - Accelerating Bayesian Statistics, 04/24/2018.
Lecture 27 - Bayesian Algorithms for Solving Stochastic Optimization Problems with Expensive Information Sources, 04/26/2018.
Homework 1 - Probability Theory Basics, due 01/23/2018.
Homework 2 - Choosing Prior Probabilities, due 01/30/2018.
Homework 3 - Bayesian Linear Regression, due 02/15/2018.
Homework 4 - Gaussian Process Regression and KL Expansion, due 03/01/2018.
Homework 6 - Polynomial Chaos and the Stochastic Collocation Method, due 03/27/2018.
Installation of Required Software for Viewing the Notebooks
Find and download the right version of Anaconda for Python 2.7 from Continuum Analytics. This package contains most of the software we are going to need.
OS Specific Instructions
Microsoft Windows
- We need C, C++, Fortran compilers, as well as the Python sources.
- Start a command line (look for cmd) and type:
conda install mingw libpython
- Finally, you need git. As you install it, make sure you select that you want to use it from the Windows command prompt.
Apple OS X
- Download and install Xcode
- Agree to the license of Xcode by opening a terminal and typing:
sudo xcrun cc
- Install your favorite version of the GNU compiler suite. You can do this with Homebrew (after you install it of course), by typing in the terminal:
brew install gcc
Alternatively, you may use MacPorts.
Linux
Nothing special is required.
Installation of Required Python Packages
Independently of the operating system, use the command line to install the following Python packages:
- Seaborn, for beautiful graphics:
conda install seaborn
- PyMC for MCMC sampling:
conda install pymc
- GPy for Gaussian process regression:
pip install GPy
- py-design for generating designs for computer codes:
pip install py-design
- py-orthpol for generating orthogonal polynomials with respect to arbitrary probability measures:
pip install py-orthpol
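Once everything is installed, a quick optional sanity check is to try importing each package and report any that are missing. The helper below is just an illustration; the package names are taken from the install commands above:

```python
# Try importing each package the course relies on and report any
# that are missing. (Names taken from the install commands above.)
import importlib

def check_packages(names):
    """Return a dict mapping package name -> True if it imports."""
    status = {}
    for name in names:
        try:
            importlib.import_module(name)
            status[name] = True
        except ImportError:
            status[name] = False
    return status

if __name__ == "__main__":
    packages = ["numpy", "scipy", "matplotlib", "seaborn", "pymc", "GPy"]
    for pkg, ok in sorted(check_packages(packages).items()):
        print("%-12s %s" % (pkg, "OK" if ok else "MISSING"))
```

Any package reported as MISSING can be reinstalled with the corresponding conda or pip command above.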
Running the notebooks
- Open the command line.
- cd to your favorite folder.
- Then, type:
git clone https://github.com/PredictiveScienceLab/uq-course.git
- This will download the contents of this repository into a folder called uq-course.
- Enter the uq-course folder.
- Start the Jupyter notebook by typing the command:
jupyter notebook
- Use the browser to navigate the course, experiment with code etc.
- If the course content is updated, type the following command (while inside uq-course) to get the latest version:
git pull origin master
Keep in mind that if you have made local changes to the repository, you may have to commit them before moving on.