Miguel A. Bessa | miguel_bessa@brown.edu | Associate Professor
What: This course is an introduction to machine learning from a probabilistic perspective.
Where: This notebook comes from this repository
Reference: Murphy, Kevin P. Probabilistic machine learning: an introduction. MIT press, 2022. Available online here
How: We try to follow Murphy's book closely, but the sequence of chapters and sections differs. The intention is to use the notebooks as an introduction to each topic and Murphy's book as a reference.
- If working offline: Go through this notebook and read the book.
- If attending class in person: listen to me (!) but also go through the notebook on your laptop at the same time. Read the book.
- If attending lectures remotely: listen to me (!) via Zoom and (ideally) use two screens, with the notebook open on one screen and the lecture on the other. Read the book.
Folder structure
- The "Lectures" folder contains each lecture in a separate folder "LectureX" where X is the lecture number.
- Each "LectureX" folder contains:
- A jupyter notebook "3dasm_LectureX.ipynb" that you can run locally or in servers like Google Colab.
- A pdf "3dasm_LectureX slides.pdf" with the slides of the course.
- A "your_data" folder that you can use to create data or other things in your own computer.
- The preferred way to follow the course is to work directly in the Jupyter notebook, as it contains additional notes and working code.
Grading
Homeworks 30%, Midterm 30%, and Final Project 40%.
Homeworks are graded on five levels only: A+ (100%; fully correct), A (90%; minor error), B (75%; significant error), C (60%; mostly incorrect, but the homework was delivered), and D (0%; not delivered). In other words, if you deliver an honest attempt at solving the homework, you receive at least 60% for that homework. Late homework can earn at most an A (90%). The worst homework grade is dropped.
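To make the scheme concrete, here is a minimal sketch (not the official grading code) of how a final grade could be computed under these weights. The level-to-percentage mapping follows the scale above; the function name and the example scores are illustrative assumptions.

```python
# Minimal sketch of the grading scheme described above (illustrative only).
# Level-to-percentage mapping from the homework scale.
LEVELS = {"A+": 1.00, "A": 0.90, "B": 0.75, "C": 0.60, "D": 0.00}


def course_grade(homework_levels, midterm, final_project):
    """Combine grades: Homeworks 30%, Midterm 30%, Final Project 40%.

    The worst homework grade is dropped before averaging.
    `midterm` and `final_project` are fractions in [0, 1].
    """
    scores = sorted(LEVELS[level] for level in homework_levels)[1:]  # drop worst
    hw_avg = sum(scores) / len(scores)
    return 0.30 * hw_avg + 0.30 * midterm + 0.40 * final_project


# Hypothetical example: six homeworks, where the "D" is dropped.
print(course_grade(["A+", "A", "B", "A+", "D", "A"], midterm=0.85, final_project=0.92))
```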
Course outline for the first half
Date | Subject | Notebook | Slides | Homework |
---|---|---|---|---|
Wed 9/6 | Introduction & Basics of univariate statistics | Lecture 1 | Slides | HW1 assigned |
Fri 9/8 | Handling data with Pandas | Lecture 2 | Slides | |
Mon 9/11 | Introducing joint & conditional distributions; Bayes' rule | Lecture 3 | Slides | |
Wed 9/13 | Multivariate statistics; visualization of joint & conditional distributions | Lecture 4 | Slides | HW1 due & HW2 assigned |
Fri 9/15 | Bayesian inference for one hidden rv: Part I | Lecture 5 | Slides | |
Mon 9/18 | Bayesian inference for one hidden rv: Part II | Lecture 6 | Slides | |
Wed 9/20 | Bayesian inference for one hidden rv: Part III | Lecture 7 | Slides | HW2 due & HW3 assigned |
Fri 9/22 | Machine Learning without going Bayesian: Point Estimates | Lecture 8 | Slides | |
Mon 9/25 | Linear Regression: Part I | Lecture 9 | Slides | |
Wed 9/27 | Linear Regression: Part II | Lecture 10 | Slides | HW3 due & HW4 assigned |
Fri 9/29 | Linear Regression: Part III | Lecture 11 | Slides | |
Mon 10/2 | Linear Regression: Part IV | Lecture 12 | Slides | |
Wed 10/4 | Gaussian process regression: Part I | Lecture 13 | Slides | HW4 due & HW5 assigned |
Fri 10/6 | Gaussian process regression: Part II | Lecture 14 | Slides | |
Mon 10/9 | HOLIDAY 🥹 | |||
Wed 10/11 | Gaussian process regression: Part III | Lecture 15 | Slides | HW5 due & HW6 assigned |
Fri 10/13 | Bayesian model selection | Lecture 16 | Slides | |
Mon 10/16 | Q&A session | |||
Wed 10/18 | Q&A session | HW6 due | ||
Fri 10/20 | No lecture | |||
Mon 10/23 | Midterm exam 🦾 | |||
Wed 10/25 | f3dasm: Framework for Data-driven Design and Analysis of Structures and Materials | Lecture f3dasm | | |
Fri 10/27 | Project 1: Learning to optimize | Lecture L2O | | |
Mon 10/30 | Project 2: Supercompressible | Lecture Supercompressible | | |
Wed 11/1 | Classification | Lecture 20 | | |