akar5h / Mixture-of-Expert-with-Variational-Autoencoder

Implementing Mixture of Experts using a Discrete VAE with TensorFlow

Mixture of Experts using Discrete VAE

All files related to training the models reside in the code directory. To train the DMVAE model from scratch on the MNIST dataset, simply run:

python train.py
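
For orientation, here is a minimal, self-contained sketch of the core idea behind a mixture-of-experts discrete VAE: a categorical latent variable, relaxed with Gumbel-Softmax so it stays differentiable, gates among several expert decoders. This is an illustration only, not the repository's actual architecture; the expert count, latent size, and layer widths below are assumptions.

```python
import tensorflow as tf

NUM_EXPERTS = 10   # assumed number of experts / latent categories
LATENT_DIM = 16    # assumed continuous latent size
TEMPERATURE = 0.5  # Gumbel-Softmax relaxation temperature

def gumbel_softmax(logits, temperature):
    """Relaxed one-hot sample: add Gumbel noise, apply a tempered softmax."""
    u = tf.random.uniform(tf.shape(logits), minval=1e-8, maxval=1.0)
    gumbel = -tf.math.log(-tf.math.log(u))
    return tf.nn.softmax((logits + gumbel) / temperature)

# Shared encoder trunk with two heads: expert-gate logits and Gaussian latent.
shared = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(256, activation="relu"),
])
gate_head = tf.keras.layers.Dense(NUM_EXPERTS)       # logits over experts
latent_head = tf.keras.layers.Dense(2 * LATENT_DIM)  # mean and log-variance

# One small decoder ("expert") per discrete latent category.
experts = [
    tf.keras.Sequential([
        tf.keras.layers.Dense(256, activation="relu"),
        tf.keras.layers.Dense(28 * 28, activation="sigmoid"),
        tf.keras.layers.Reshape((28, 28)),
    ])
    for _ in range(NUM_EXPERTS)
]

def reconstruct(x):
    h = shared(x)
    mean, logvar = tf.split(latent_head(h), 2, axis=-1)
    z = mean + tf.exp(0.5 * logvar) * tf.random.normal(tf.shape(mean))
    w = gumbel_softmax(gate_head(h), TEMPERATURE)     # (batch, K) soft gate
    outs = tf.stack([e(z) for e in experts], axis=1)  # (batch, K, 28, 28)
    return tf.einsum("bk,bkhw->bhw", w, outs)         # gate-weighted mixture

x = tf.random.uniform((8, 28, 28))  # stand-in batch of MNIST-sized images
x_hat = reconstruct(x)              # (8, 28, 28) reconstructions
```

A full training step would add the usual ELBO terms (a Bernoulli reconstruction likelihood plus KL penalties on the Gaussian latent and on the gate against a uniform categorical prior), and typically anneals the temperature toward zero so expert assignments become hard.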

This will automatically train the model and save the relevant reconstruction and generation plots. Parameters such as the model and dataset can be controlled via command-line arguments. To see the full list of supported arguments, run:

python train.py --helpshort
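
The --helpshort flag suggests the arguments are defined with absl-style flags (which TensorFlow 1.x re-exported as tf.app.flags). As an illustration of how such flags produce that behavior, a sketch follows; the flag names model and dataset come from the sentence above, and the defaults are assumptions, not the repository's actual arguments.

```python
from absl import app, flags

FLAGS = flags.FLAGS
# Hypothetical flags; names and defaults are assumptions.
flags.DEFINE_string("model", "dmvae", "Which model variant to train.")
flags.DEFINE_string("dataset", "mnist", "Dataset to train on.")
flags.DEFINE_integer("epochs", 100, "Number of training epochs.")

def main(argv):
    del argv  # unused positional arguments
    print("Training %s on %s for %d epochs"
          % (FLAGS.model, FLAGS.dataset, FLAGS.epochs))

if __name__ == "__main__":
    # absl.app registers --help and --helpshort automatically.
    app.run(main)
```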

Our code makes use of the following libraries:

  • TensorFlow
  • NumPy
  • scikit-learn
  • Matplotlib
  • tqdm
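
Assuming a standard pip setup, the dependencies can be installed with (unpinned package names; pin versions as needed):

pip install tensorflow numpy scikit-learn matplotlib tqdm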
