This repository is a collection of notebooks about Bayesian Machine Learning. The following links display the notebooks via nbviewer to ensure a proper rendering of formulas.
- Latent variable models - part 1: Gaussian mixture models and the EM algorithm. Introduction to the expectation-maximization (EM) algorithm and its application to Gaussian mixture models. Example implementation with plain NumPy/SciPy and scikit-learn for comparison.
- Latent variable models - part 2: Stochastic variational inference and variational autoencoders. Introduction to stochastic variational inference with a variational autoencoder as an application example. Implementation with TensorFlow 2.x.
- Variational inference in Bayesian neural networks. Demonstrates how to implement and train a Bayesian neural network using a variational inference approach. Example implementation with Keras.
- Bayesian regression with linear basis function models. Introduction to Bayesian linear regression. Implementation from scratch with plain NumPy, with scikit-learn for comparison.
- Gaussian processes. Introduction to Gaussian processes. Example implementations with plain NumPy/SciPy as well as with the scikit-learn and GPy libraries.
- Bayesian optimization. Introduction to Bayesian optimization. Example implementations with plain NumPy/SciPy as well as with the scikit-optimize and GPyOpt libraries. Hyperparameter tuning as an application example.
- Deep feature consistent variational auto-encoder. Describes how a perceptual loss can improve the quality of images generated by a variational auto-encoder. Example implementation with Keras.
- Conditional generation via Bayesian optimization in latent space. Describes an approach for conditionally generating outputs with desired properties by running Bayesian optimization in the latent space of variational auto-encoders. Example application implemented with Keras and GPyOpt.
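As a taste of the Gaussian mixture notebook, scikit-learn's `GaussianMixture` estimator fits a mixture model with the EM algorithm internally. The toy data and hyperparameters below are illustrative, not taken from the notebook:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative toy data: two well-separated Gaussian clusters in 2-D
rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=[-2, 0], scale=0.5, size=(100, 2)),
    rng.normal(loc=[2, 0], scale=0.5, size=(100, 2)),
])

# GaussianMixture runs EM until the log-likelihood converges
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

labels = gmm.predict(X)       # hard cluster assignments
resp = gmm.predict_proba(X)   # soft responsibilities (E-step output)
print(gmm.means_)             # learned component means, near (-2, 0) and (2, 0)
```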
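The Bayesian linear regression notebook relies on the fact that the weight posterior has a closed form for a Gaussian prior and Gaussian noise. A minimal NumPy sketch, with an assumed straight-line basis and illustrative precision values:

```python
import numpy as np

def posterior(Phi, t, alpha, beta):
    """Weight posterior for Bayesian linear regression with a zero-mean
    isotropic Gaussian prior (precision alpha) and Gaussian noise
    (precision beta): returns mean m_N and covariance S_N."""
    S_N_inv = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
    S_N = np.linalg.inv(S_N_inv)
    m_N = beta * S_N @ Phi.T @ t
    return m_N, S_N

# Illustrative 1-D data from the line t = 0.5 x - 0.3 plus noise
rng = np.random.default_rng(1)
x = rng.uniform(-1, 1, size=20)
t = 0.5 * x - 0.3 + rng.normal(scale=0.1, size=20)

Phi = np.stack([np.ones_like(x), x], axis=1)  # basis functions [1, x]
m_N, S_N = posterior(Phi, t, alpha=2.0, beta=1 / 0.1 ** 2)
print(m_N)  # posterior mean weights, recovering roughly [-0.3, 0.5]
```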
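For the Gaussian processes notebook, the scikit-learn route can be sketched as follows; the sine toy function, kernel choice, and noise level are illustrative assumptions:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Illustrative noisy samples from a sine function
rng = np.random.default_rng(2)
X_train = np.linspace(0, 5, 15).reshape(-1, 1)
y_train = np.sin(X_train).ravel() + rng.normal(scale=0.1, size=15)

# GP prior with an RBF kernel; alpha adds the noise variance to the
# kernel diagonal, and fit() optimizes the kernel hyperparameters
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=0.1 ** 2)
gpr.fit(X_train, y_train)

# Posterior predictive mean and standard deviation on a test grid
X_test = np.linspace(0, 5, 50).reshape(-1, 1)
mu, std = gpr.predict(X_test, return_std=True)
```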
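The core loop of the Bayesian optimization notebook — fit a surrogate, maximize an acquisition function, evaluate, repeat — fits in a short sketch using scikit-learn's GP and an expected-improvement acquisition; the quadratic objective and all settings here are illustrative:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor

# Illustrative objective to minimize, with its minimum at x = 2
def f(x):
    return (x - 2.0) ** 2

X = np.array([[-4.0], [3.0]])  # initial design points
y = f(X).ravel()
grid = np.linspace(-5.0, 5.0, 500).reshape(-1, 1)

for _ in range(10):
    # Fit a GP surrogate to the evaluations collected so far
    gpr = GaussianProcessRegressor(alpha=1e-6, normalize_y=True).fit(X, y)
    mu, std = gpr.predict(grid, return_std=True)
    std = np.maximum(std, 1e-9)
    # Expected improvement over the best observation (minimization)
    imp = y.min() - mu
    z = imp / std
    ei = imp * norm.cdf(z) + std * norm.pdf(z)
    # Evaluate the objective where EI is largest and update the data
    x_next = grid[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, f(x_next[0]))

x_best = X[np.argmin(y)]  # should end up close to the true minimum at 2
```

The dedicated libraries (scikit-optimize, GPyOpt) wrap exactly this loop behind a single call, adding better acquisition optimization than the grid search above.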