samirbraga/GMVAE (forked from jariasf/GMVAE)

Implementation of Gaussian Mixture Variational Autoencoder (GMVAE) for Unsupervised Clustering

Gaussian Mixture Variational Autoencoder


Implementation of the Gaussian Mixture Variational Autoencoder (GMVAE) for unsupervised clustering, in PyTorch and TensorFlow. The probabilistic model is based on the one proposed by Rui Shu, which is a modification of the unsupervised M2 model proposed by Kingma et al. for semi-supervised learning. Unlike other implementations that marginalize over the categorical latent variable, we sample it with the Gumbel-Softmax distribution, which improves the running time by reducing the number of gradient estimations required per step.
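To make the Gumbel-Softmax trick concrete, here is a minimal, framework-agnostic sketch in NumPy of how a relaxed one-hot sample of the categorical latent variable can be drawn. The function name, temperature value, and logits below are illustrative and not taken from this repository:

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=1.0, rng=None):
    """Draw a differentiable, relaxed one-hot sample from a categorical
    distribution parameterized by unnormalized log-probabilities."""
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: -log(-log(U)) with U ~ Uniform(0, 1);
    # small constants guard against log(0)
    u = rng.uniform(size=logits.shape)
    gumbels = -np.log(-np.log(u + 1e-20) + 1e-20)
    # Softmax of the perturbed logits at the given temperature
    z = (logits + gumbels) / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative usage: 3 mixture components with known probabilities
logits = np.log(np.array([0.7, 0.2, 0.1]))
sample = gumbel_softmax_sample(logits, temperature=0.5)
# The sample lies on the probability simplex; lower temperatures push
# it toward a discrete one-hot vector
```

Because the sample is a smooth function of the logits, gradients flow through it directly, which is what removes the need to marginalize over every mixture component.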

Dependencies

  1. TensorFlow. We tested our method with TensorFlow 1.13.1. You can install TensorFlow by following the instructions on its website: https://www.tensorflow.org/install/pip?lang=python2.
  • Caveat: TensorFlow 2.0 introduced breaking changes, so this implementation will not run on it directly. Check the migration guide to execute this implementation under TensorFlow 2.0.
  2. PyTorch. We tested our method with PyTorch 1.3.0. You can install PyTorch by following the instructions on its website: https://pytorch.org/get-started/locally/.

  3. Python 3.6.8. We implemented our method with Python 3.6.8. Additional libraries include: numpy, scipy and matplotlib.

References
