yjlolo / GMVAE

Implementation of Gaussian Mixture Variational Autoencoder (GMVAE) for Unsupervised Clustering


Gaussian Mixture Variational Autoencoder

Available in TensorFlow and PyTorch, with Open in Colab notebooks for both.

Implementation of the Gaussian Mixture Variational Autoencoder (GMVAE) for unsupervised clustering in TensorFlow. The model is based on the M2 model proposed by Kingma et al. for semi-supervised learning. Unlike other implementations that marginalize out the categorical latent variable, we sample it with the Gumbel-Softmax distribution, which reduces the number of gradient estimations required and therefore improves time complexity. We modified the M2 generative model so that it represents a Mixture of Gaussians.

Dependencies

  1. TensorFlow. We tested our method with TensorFlow 1.13.1. You can install TensorFlow by following the instructions on its website: https://www.tensorflow.org/install/pip?lang=python2.
  • Caveat: TensorFlow 2.0 introduced breaking changes that prevent running this implementation directly. See the migration guide to run it under TensorFlow 2.0.
  2. PyTorch. We tested our method with PyTorch 1.3.0. You can install PyTorch by following the instructions on its website: https://pytorch.org/get-started/locally/.
  3. Python 3.6.8. We implemented our method with Python 3.6.8. Additional required libraries: numpy, scipy, and matplotlib.

License: MIT License


Languages

Python 100.0%