Experiments in machine learning
File | Remarks |
---|---|
ae.py | Simple autoencoder tutorial |
gibbs.py | A generic Gibbs sampler
hopfield.py | Hopfield network |
learn.bib | Bibliography |
learn.wpr | Project file for Wing IDE |
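To illustrate the technique behind hopfield.py, here is a minimal Hopfield-network sketch in numpy (the function names and pattern sizes are invented for the example, not taken from the repository): Hebbian training stores patterns as an outer-product weight matrix with zero diagonal, and recall iterates sign updates until a fixed point.

```python
import numpy as np

def train_hopfield(patterns):
    # Hebbian learning: W = (1/N) * sum of outer products, zero diagonal.
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=10):
    # Synchronous sign updates until convergence (or the step limit).
    s = state.copy()
    for _ in range(steps):
        new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return s
```

With a single stored pattern, flipping one bit of the input is corrected in a single update step.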
File | Remarks |
---|---|
bayes1.py | Simple PyMC3 demo--code from Estimating Probabilities with Bayesian Modeling in Python
gibbs.py | Bayesian Inference: Gibbs Sampling |
mhast.py | Bayesian Inference: Metropolis-Hastings |
naive.py | Code from How to Develop a Naive Bayes Classifier from Scratch in Python |
UGMM.py | CAVI code adapted from Zhiya Zuo
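mhast.py implements Metropolis-Hastings; the core of a generic random-walk Metropolis sampler can be sketched in a few lines (this is an illustrative sketch, not the repository's code):

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_samples, proposal_scale=1.0, seed=0):
    # Random-walk Metropolis: propose x' ~ N(x, scale) and accept with
    # probability min(1, target(x') / target(x)), computed in log space.
    rng = np.random.default_rng(seed)
    samples = np.empty(n_samples)
    x = x0
    lp = log_target(x)
    for i in range(n_samples):
        x_new = x + proposal_scale * rng.standard_normal()
        lp_new = log_target(x_new)
        if np.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
        samples[i] = x
    return samples
```

For example, `metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)` draws (after burn-in) approximate samples from a standard normal.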
File | Remarks |
---|---|
LeNet5.py | LeNet-5 CNN in Keras
losses.R | Plot loss and accuracy for training and validation data from log files
tf1.py | TensorFlow tutorial
tf2.py | Modification of tf1.py to use a convolutional layer
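tf2.py swaps in a convolutional layer; what such a layer computes per filter is a cross-correlation over image patches, which can be sketched in plain numpy (illustrative only--a real Conv2D layer adds channels, strides, padding options, and learned filters):

```python
import numpy as np

def conv2d_valid(image, kernel):
    # Cross-correlation with 'valid' padding, the per-filter computation
    # of a convolutional layer (deep-learning "convolution" does not
    # flip the kernel).
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out
```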
File | Remarks |
---|---|
torch-nn.py | Train a neural network with PyTorch
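The forward / loss / backward / update cycle that a torch.nn training loop performs can be sketched without PyTorch, here for a one-parameter linear model fit by gradient descent in numpy (an illustrative sketch, not the repository's code):

```python
import numpy as np

def train_linear(x, y, lr=0.1, epochs=200):
    # Gradient descent on mean-squared error for y ~ w*x + b:
    # the same forward / loss / backward / update cycle a torch.nn
    # training loop runs, with the gradients written out by hand.
    w, b = 0.0, 0.0
    for _ in range(epochs):
        pred = w * x + b                # forward pass
        err = pred - y                  # residuals
        grad_w = 2 * np.mean(err * x)   # dL/dw
        grad_b = 2 * np.mean(err)       # dL/db
        w -= lr * grad_w                # parameter update
        b -= lr * grad_b
    return w, b
```

Fitting data generated as `y = 2*x + 1` recovers the slope and intercept to high precision.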
Programs written to understand Variational Inference, based on the following references:
- David M. Blei, Alp Kucukelbir & Jon D. McAuliffe (2017)--Variational Inference: A Review for Statisticians
- Padhraic Smyth--Notes on the EM Algorithm for Gaussian Mixtures: CS 274A, Probabilistic Learning
File | Remarks |
---|---|
CAVI.tex | Documentation for the VI programs
cavi1.py | CAVI for a univariate Gaussian, from the Univariate Gaussian Example
cavi3.py | The Coordinate Ascent Mean-Field Variational Inference (CAVI) example from Section 3 of Blei et al.
cavi.py | The Coordinate Ascent Mean-Field Variational Inference (CAVI) example from Section 3 of Blei et al.
em.py | Expectation Maximization |
gmm.py | Generate data in accordance with Gaussian Mixture Model |
motifs.py | Gibbs sampler for finding motifs--implements GibbsSampler
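gmm.py generates data in accordance with a Gaussian mixture model; that sampling procedure can be sketched as follows (the function name and parameters are invented for the example): pick a component according to the mixture weights, then draw from that component's Gaussian.

```python
import numpy as np

def sample_gmm(n, weights, means, stds, seed=0):
    # Draw n points from a 1-D Gaussian mixture: choose a component
    # index by its weight, then sample from that component's Gaussian.
    rng = np.random.default_rng(seed)
    ks = rng.choice(len(weights), size=n, p=weights)
    xs = rng.normal(np.asarray(means)[ks], np.asarray(stds)[ks])
    return xs, ks
```

Returning the component labels `ks` alongside the samples is convenient for checking EM or CAVI fits against the ground truth.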
Programs based on A tutorial on the free-energy framework for modelling perception and learning, by Rafal Bogacz
File | Remarks |
---|---|
feex1.py | Exercise 1--posterior probabilities |
feex2.py | Exercise 2--most likely size |
feex3.py | Exercise 3--neural implementation |
feex5.py | Exercise 5--learn variance |
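Exercise 1 computes posterior probabilities numerically; a sketch of that grid computation, assuming the tutorial's setup of a Gaussian prior over the size v and a Gaussian likelihood for the light intensity u with mean g(v) = v**2 (the default constants below are illustrative):

```python
import numpy as np

def grid_posterior(u, v_grid, v_p=3.0, sigma_p=1.0, sigma_u=1.0):
    # Numerical posterior p(v|u) proportional to p(v) * p(u|v) on a grid:
    # Gaussian prior N(v_p, sigma_p^2) over the size v, Gaussian
    # likelihood N(g(v), sigma_u^2) for the intensity u with g(v) = v**2.
    prior = np.exp(-0.5 * (v_grid - v_p) ** 2 / sigma_p**2)
    like = np.exp(-0.5 * (u - v_grid**2) ** 2 / sigma_u**2)
    post = prior * like
    # Normalize so the grid approximation integrates to one.
    return post / (post.sum() * (v_grid[1] - v_grid[0]))
```

For an observed intensity u = 2 the posterior over v is sharply peaked well below the prior mean, showing how the nonlinear likelihood pulls the estimate away from the prior.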