Gaussian mixture models in PyTorch.
This repository contains an implementation of a simple Gaussian mixture model (GMM) fitted with the Expectation-Maximization (EM) algorithm in PyTorch. The interface closely follows that of sklearn.

(Figure: example of a fit via a Gaussian mixture model.)


A new model is instantiated by calling gmm.GaussianMixture(..), providing the number of components and the tensor dimension as arguments. Note that once instantiated, the model expects tensors in a flattened shape (n, d).
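For instance, image-like data has to be flattened into that (n, d) shape before it is passed to the model. A minimal sketch, using NumPy as a stand-in for tensors (the array names here are hypothetical, not from the repository):

```python
import numpy as np

# Hypothetical batch of 8 single-channel 28x28 images.
images = np.zeros((8, 1, 28, 28))

# Flatten each image into a d-dimensional vector: shape (n, d) = (8, 784).
flat = images.reshape(len(images), -1)
print(flat.shape)
```

The same reshape works on PyTorch tensors via `tensor.view(n, -1)` or `tensor.reshape(n, -1)`.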

A typical workflow is to first fit the model via model.fit(data) and then predict cluster assignments with model.predict(data). To reproduce the figure above, run the provided example.py.
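The repository's actual EM implementation lives in gmm.py; as an illustration of what fit and predict compute, here is a minimal, self-contained diagonal-covariance EM sketch in NumPy. This is not the repository's code, and all names in it are hypothetical:

```python
import numpy as np

def fit_gmm(x, k, n_iter=50):
    """Fit a diagonal-covariance GMM to x of shape (n, d) via EM."""
    n, d = x.shape
    # Crude deterministic initialization: means at points spread
    # along the first feature; unit variances; uniform weights.
    idx = np.argsort(x[:, 0])[np.linspace(0, n - 1, k).astype(int)]
    mu, var, pi = x[idx].copy(), np.ones((k, d)), np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: log responsibilities log p(j | x_i), up to a constant.
        log_p = (-0.5 * (((x[:, None] - mu) ** 2) / var
                         + np.log(2 * np.pi * var)).sum(-1)
                 + np.log(pi))
        log_p -= log_p.max(1, keepdims=True)  # numerical stability
        r = np.exp(log_p)
        r /= r.sum(1, keepdims=True)
        # M-step: re-estimate weights, means, variances from responsibilities.
        nk = r.sum(0) + 1e-10
        pi = nk / n
        mu = (r.T @ x) / nk[:, None]
        var = (r.T @ (x ** 2)) / nk[:, None] - mu ** 2 + 1e-6
    return mu, var, pi

def predict_gmm(x, mu, var, pi):
    """Assign each row of x to the component with highest posterior."""
    log_p = (-0.5 * (((x[:, None] - mu) ** 2) / var
                     + np.log(2 * np.pi * var)).sum(-1)
             + np.log(pi))
    return log_p.argmax(1)
```

On two well-separated clusters, fit_gmm recovers one component per cluster and predict_gmm labels the points accordingly.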

Some sanity checks can be run by calling python test.py. To fit data on GPUs, make sure you call model.cuda() first.
