Course material for my graduate class on Detection, Estimation, and Learning. The course covers the fundamentals of detection and estimation theory, including hypothesis testing, maximum-likelihood and Bayes estimation, and tracking of linear systems via the Kalman filter. It also covers applications of these ideas to machine learning: regression, classification, feature extraction, and sparse coding and signal recovery.
The file course-notes.pdf is a ~120-page monograph of my lecture materials for the class.
The folders contain Jupyter notebooks with Python code for in-class demonstrations of the course material. They follow the sequence of chapters in the lecture notes, which go as follows:
Chapter 1: Course introduction: review of important concepts from probability theory, linear algebra, and optimization theory.
Chapter 3: Detection theory: Binary and multiple hypothesis testing, ROC curves, and minimization of Bayes risk.
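As a minimal sketch of the detection ideas above (the specific distributions and parameters here are illustrative, not taken from the notes): for a scalar Gaussian shift-in-mean test, the likelihood ratio test reduces to thresholding the observation itself, and sweeping the threshold traces out the ROC curve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Binary hypothesis test: H0: x ~ N(0,1) vs H1: x ~ N(1,1).
# The likelihood ratio is monotone in x, so the LRT thresholds x directly.
n = 10_000
x0 = rng.normal(0.0, 1.0, n)   # samples under H0
x1 = rng.normal(1.0, 1.0, n)   # samples under H1

# Sweep the threshold to trace out the ROC curve.
thresholds = np.linspace(-4.0, 5.0, 200)
pfa = np.array([(x0 > t).mean() for t in thresholds])  # false-alarm probability
pd = np.array([(x1 > t).mean() for t in thresholds])   # detection probability

# Area under the ROC curve by trapezoidal integration.
# Theory for unit-variance Gaussians one mean apart: AUC = Phi(1/sqrt(2)) ~ 0.76.
order = np.argsort(pfa)
auc = np.sum(np.diff(pfa[order]) * (pd[order][:-1] + pd[order][1:]) / 2)
print(f"empirical AUC = {auc:.3f}")
```

The empirical AUC should land near the theoretical value of about 0.76 for this separation.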
Chapter 4: Parameter estimation: Maximum-likelihood estimation, the MVUE, and sufficient statistics. Machine learning applications of estimation theory, including linear regression, logistic regression, PCA, and k-means.
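A small illustration of the estimation-theory/regression connection (synthetic data; the model and parameters are assumptions for the sketch): under a linear model with Gaussian noise, the maximum-likelihood estimate of the weights is the ordinary least-squares solution.

```python
import numpy as np

rng = np.random.default_rng(1)

# Linear model y = X w + noise. With i.i.d. Gaussian noise, the ML
# estimate of w is the least-squares solution (X^T X)^{-1} X^T y.
n, d = 200, 3
X = rng.normal(size=(n, d))
w_true = np.array([2.0, -1.0, 0.5])
y = X @ w_true + 0.1 * rng.normal(size=n)

# np.linalg.lstsq solves the least-squares problem stably via SVD.
w_ml, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w_ml)  # close to w_true
```

With 200 samples and low noise, the estimate recovers the true weights to a few hundredths.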
Chapter 5: Bayesian estimation: MMSE/MAP estimators, conjugate priors, and Gaussian signal processing. Machine learning applications, including sparse coding/signal recovery, ridge regression, and the LASSO.
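To sketch the Bayesian view of regression (synthetic setup, purely illustrative): a zero-mean Gaussian prior on the weights turns maximum-likelihood least squares into MAP estimation, i.e. ridge regression, which shrinks the estimate toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ridge regression as MAP estimation with a zero-mean Gaussian prior:
#   w_ridge = (X^T X + lam * I)^{-1} X^T y
n, d = 50, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.5 * rng.normal(size=n)

lam = 10.0  # ratio of noise variance to prior variance (illustrative value)
w_ols = np.linalg.lstsq(X, y, rcond=None)[0]
w_ridge = np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# The prior pulls the MAP estimate toward zero: its norm is smaller.
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))
```

In the SVD basis, each component of the OLS solution is scaled by a factor strictly less than one, which is why the ridge norm is always smaller for lam > 0.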
Chapter 6: Kalman filter: Linear state-space models, the Kalman filter, and the extended Kalman filter.
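A minimal scalar example of the Kalman filter on a linear state-space model (the model and noise variances below are illustrative choices, not from the notes): track a random walk from noisy observations via the predict/update recursion.

```python
import numpy as np

rng = np.random.default_rng(2)

# Scalar linear state-space model:
#   x_k = x_{k-1} + w_k,  w_k ~ N(0, Q)   (state: random walk)
#   y_k = x_k + v_k,      v_k ~ N(0, R)   (noisy observation)
Q, R = 0.01, 1.0
T = 500
x = np.cumsum(rng.normal(0.0, np.sqrt(Q), T))  # true state trajectory
y = x + rng.normal(0.0, np.sqrt(R), T)         # measurements

# Kalman filter: predict, then correct with the Kalman gain.
xhat, P = 0.0, 1.0
est = np.empty(T)
for k in range(T):
    P = P + Q                       # predict: variance grows by Q
    K = P / (P + R)                 # Kalman gain
    xhat = xhat + K * (y[k] - xhat) # update with the innovation
    P = (1 - K) * P                 # posterior variance shrinks
    est[k] = xhat

mse_raw = np.mean((y - x) ** 2)   # ~ R, error of raw measurements
mse_kf = np.mean((est - x) ** 2)  # much smaller after filtering
print(mse_raw, mse_kf)
```

Because the process noise is small relative to the measurement noise, the filter averages over many observations and its steady-state error variance is roughly a tenth of R here.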