ChuaCheowHuan/bayesian_ML

Bayesian-based machine learning implementations (GMM, VAE & conditional VAE).

What's in this repository?

This repository contains implementations of the following Bayesian machine-learning algorithms:

(1) GMM (Gaussian mixture model) clustering

(2) VAE (variational autoencoder) with the MNIST dataset

(3) Conditional VAE (conditional variational autoencoder) with the MNIST dataset

The Jupyter notebooks have been tested on Google Colab.


GMM

Gaussian mixture model (GMM) clustering with the EM (expectation-maximization) algorithm, using the variational lower bound as a stopping criterion.

The following are implemented (a rough NumPy sketch of these pieces appears after the list):

(1) A function that avoids explicitly computing a matrix inverse by solving a system of linear equations instead.

(2) The log-sum-exp trick to avoid numerical underflow when multiplying many small numbers.

(3) The pdf of the multivariate normal distribution

(4) E-step function of the EM algorithm

(5) M-step function of the EM algorithm

(6) Variational lower bound function

(7) GMM function

(8) Training function for GMM

(9) A scatter plot of the clusters (the plot at the bottom shows 7 clusters fitted to a dataset of 100 points)
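
Below is a minimal NumPy sketch of how these pieces could fit together. It is not the notebook's exact code: the function names (log_mvn_pdf, e_step, m_step, train_gmm), the Cholesky-based solve, and the small ridge added to the covariances are illustrative assumptions.

```python
import numpy as np
from scipy.special import logsumexp  # log-sum-exp trick to avoid underflow

def log_mvn_pdf(X, mu, Sigma):
    """Log-density of N(mu, Sigma) at each row of X, without inverting Sigma.

    The Mahalanobis term is obtained by solving a linear system against the
    Cholesky factor instead of forming an explicit matrix inverse.
    """
    d = X.shape[1]
    L = np.linalg.cholesky(Sigma)              # Sigma = L @ L.T
    diff = X - mu                              # (n, d)
    z = np.linalg.solve(L, diff.T)             # solve L z = diff.T
    maha = np.sum(z ** 2, axis=0)              # (x - mu)^T Sigma^{-1} (x - mu)
    log_det = 2.0 * np.sum(np.log(np.diag(L)))
    return -0.5 * (d * np.log(2.0 * np.pi) + log_det + maha)

def e_step(X, pi, mu, Sigma):
    """E-step: responsibilities gamma[n, k] = p(z = k | x_n), computed in log space."""
    K = len(pi)
    log_r = np.stack([np.log(pi[k]) + log_mvn_pdf(X, mu[k], Sigma[k])
                      for k in range(K)], axis=1)           # (n, K)
    log_norm = logsumexp(log_r, axis=1, keepdims=True)      # log p(x_n)
    # With q set to the exact posterior, the variational lower bound is tight
    # and equals the marginal log-likelihood, so it can act as the stopping criterion.
    return np.exp(log_r - log_norm), float(log_norm.sum())

def m_step(X, gamma):
    """M-step: re-estimate mixture weights, means and covariances."""
    n, d = X.shape
    Nk = gamma.sum(axis=0)                                   # effective counts per cluster
    pi = Nk / n
    mu = (gamma.T @ X) / Nk[:, None]
    Sigma = []
    for k in range(gamma.shape[1]):
        diff = X - mu[k]
        Sigma.append((gamma[:, k, None] * diff).T @ diff / Nk[k]
                     + 1e-6 * np.eye(d))                     # small ridge for stability
    return pi, mu, np.array(Sigma)

def train_gmm(X, K, n_iter=100, tol=1e-4, seed=0):
    """Run EM until the lower bound stops improving by more than tol."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)
    mu = X[rng.choice(n, size=K, replace=False)]
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    prev_bound = -np.inf
    for _ in range(n_iter):
        gamma, bound = e_step(X, pi, mu, Sigma)
        pi, mu, Sigma = m_step(X, gamma)
        if bound - prev_bound < tol:
            break
        prev_bound = bound
    return pi, mu, Sigma, gamma
```

Computing the responsibilities in log space and normalizing with logsumexp is what keeps the E-step stable when the per-component densities are tiny.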

Data:

GMM clusters:


VAE

Variational autoencoder with the MNIST dataset.

VAE graph:

VAE output on training & validation data:

VAE decoder output from random Gaussian noise: a sample is drawn from the prior distribution p(t) (Gaussian) and then from the likelihood p(x | t):
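
A hedged Keras sketch of such a VAE on MNIST is shown below. The latent dimensionality, layer sizes, loss scaling, and the Sampling layer are assumptions for illustration; the notebook's architecture may differ.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

original_dim, latent_dim = 28 * 28, 2   # latent size is an assumption

class Sampling(layers.Layer):
    """Reparameterization trick t = mu + sigma * eps; also registers the
    KL(q(t|x) || p(t)) term of the negative ELBO as a layer loss."""
    def call(self, inputs):
        mean, log_var = inputs
        kl = -0.5 * tf.reduce_sum(1.0 + log_var - tf.square(mean) - tf.exp(log_var), axis=-1)
        self.add_loss(tf.reduce_mean(kl))
        eps = tf.random.normal(tf.shape(mean))
        return mean + tf.exp(0.5 * log_var) * eps

# Encoder q(t | x)
x_in = keras.Input(shape=(original_dim,))
h = layers.Dense(256, activation="relu")(x_in)
t_mean = layers.Dense(latent_dim)(h)
t_log_var = layers.Dense(latent_dim)(h)
t = Sampling()([t_mean, t_log_var])

# Decoder p(x | t): Bernoulli pixel probabilities
decoder = keras.Sequential([
    layers.Dense(256, activation="relu"),
    layers.Dense(original_dim, activation="sigmoid"),
])
x_out = decoder(t)

vae = keras.Model(x_in, x_out)
# Reconstruction term of the negative ELBO (summed over pixels); the KL term
# is added by the Sampling layer above.
recon = lambda x, x_rec: original_dim * keras.losses.binary_crossentropy(x, x_rec)
vae.compile(optimizer="adam", loss=recon)

(x_train, _), (x_val, _) = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, original_dim).astype("float32") / 255.0
x_val = x_val.reshape(-1, original_dim).astype("float32") / 255.0
vae.fit(x_train, x_train, validation_data=(x_val, x_val), epochs=10, batch_size=128)

# Generation: sample t from the prior p(t) = N(0, I), then decode through p(x | t).
t_prior = np.random.normal(size=(15, latent_dim)).astype("float32")
generated = decoder.predict(t_prior).reshape(-1, 28, 28)
```

The last two lines correspond to the generation procedure described in the caption above: draw t from the standard normal prior and push it through the decoder to obtain the pixel probabilities of p(x | t).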


Conditional VAE

Conditional VAE decoder output from random Gaussian noise: a sample is drawn from the prior distribution p(t) (Gaussian) and then from the likelihood p(x | t):
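
For the conditional VAE, the class label conditions both the encoder and the decoder, so at generation time a specific digit can be requested. The sketch below is again an assumption-laden illustration (layer sizes, latent dimensionality, one-hot conditioning), not the notebook's exact code.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras import layers

original_dim, latent_dim, n_classes = 28 * 28, 2, 10  # sizes are assumptions

class Sampling(layers.Layer):
    """Reparameterization trick; also adds KL(q(t|x, y) || p(t)) to the losses."""
    def call(self, inputs):
        mean, log_var = inputs
        kl = -0.5 * tf.reduce_sum(1.0 + log_var - tf.square(mean) - tf.exp(log_var), axis=-1)
        self.add_loss(tf.reduce_mean(kl))
        return mean + tf.exp(0.5 * log_var) * tf.random.normal(tf.shape(mean))

# Encoder q(t | x, y): the one-hot label is concatenated with the image.
x_in = keras.Input(shape=(original_dim,))
y_in = keras.Input(shape=(n_classes,))
h = layers.Dense(256, activation="relu")(layers.Concatenate()([x_in, y_in]))
t_mean = layers.Dense(latent_dim)(h)
t_log_var = layers.Dense(latent_dim)(h)
t = Sampling()([t_mean, t_log_var])

# Decoder p(x | t, y): the label is concatenated with the latent code.
t_in = keras.Input(shape=(latent_dim,))
y_dec = keras.Input(shape=(n_classes,))
d = layers.Dense(256, activation="relu")(layers.Concatenate()([t_in, y_dec]))
decoder = keras.Model([t_in, y_dec], layers.Dense(original_dim, activation="sigmoid")(d))
x_out = decoder([t, y_in])

cvae = keras.Model([x_in, y_in], x_out)
recon = lambda x, x_rec: original_dim * keras.losses.binary_crossentropy(x, x_rec)
cvae.compile(optimizer="adam", loss=recon)

(x_train, y_train), _ = keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, original_dim).astype("float32") / 255.0
y_train = keras.utils.to_categorical(y_train, n_classes)
cvae.fit([x_train, y_train], x_train, epochs=10, batch_size=128)

# Generation: sample t ~ p(t) = N(0, I), pick a digit label, decode via p(x | t, y).
t_prior = np.random.normal(size=(10, latent_dim)).astype("float32")
labels = keras.utils.to_categorical(np.arange(10), n_classes)  # one of each digit
generated = decoder.predict([t_prior, labels]).reshape(-1, 28, 28)
```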