djsutherland / igms

Implicit generative models and related stuff based on the MMD, in PyTorch

Implementation of implicit generative models based on the MMD in PyTorch (for Python 3.6+). Very much a work in progress; feel free to get in touch if you want to use it.

Currently contains:

  • igms.featurize: extract features of images from pretrained classifiers (torchvision models, or the L2-robust ones from locuslab/smoothing).
  • igms.kernels: various standard kernels, optionally applied on top of those features, with machinery to cache kernel sums so they can be reused across computations.
  • igms.mmd: estimate the MMD, run permutation tests based on it, and compute unbiased estimates of its variance, all from an igms.kernels.LazyKernelPair; a from-scratch sketch of the basic estimator appears after this list.
  • train_gmmn.py: train a Generative Moment Matching Network (Y. Li+ ICML-15 / Dziugaite+ UAI-15), i.e. minimize the MMD with a fixed kernel between the model and the target distribution, but with much richer kernels than those papers used; the sketch after this list also includes a minimal training step.
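
To make the pieces above concrete, here is a minimal from-scratch sketch of the unbiased MMD² estimator and a GMMN-style training step in plain PyTorch. It assumes a simple Gaussian RBF kernel, and the names (gaussian_kernel, mmd2_unbiased, the toy generator) are illustrative placeholders, not the igms API:

```python
import torch
import torch.nn as nn


def gaussian_kernel(x, y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)) for every pair of rows.
    sq_dists = (x.unsqueeze(1) - y.unsqueeze(0)).pow(2).sum(-1)
    return torch.exp(-sq_dists / (2 * sigma ** 2))


def mmd2_unbiased(x, y, sigma=1.0):
    # Unbiased U-statistic estimate of MMD^2 between samples x (m, d) and y (n, d):
    # drop the diagonal terms of the within-sample kernel matrices.
    m, n = x.shape[0], y.shape[0]
    k_xx = gaussian_kernel(x, x, sigma)
    k_yy = gaussian_kernel(y, y, sigma)
    k_xy = gaussian_kernel(x, y, sigma)
    term_xx = (k_xx.sum() - k_xx.diagonal().sum()) / (m * (m - 1))
    term_yy = (k_yy.sum() - k_yy.diagonal().sum()) / (n * (n - 1))
    return term_xx + term_yy - 2 * k_xy.mean()


# GMMN-style training step: minimize the MMD between generator samples and data
# with a fixed kernel. The generator below is a throwaway placeholder architecture.
generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))
opt = torch.optim.Adam(generator.parameters(), lr=1e-3)

real = torch.randn(128, 2) + 3.0      # stand-in for a batch of data (or its features)
fake = generator(torch.randn(128, 16))

opt.zero_grad()
loss = mmd2_unbiased(fake, real, sigma=1.0)
loss.backward()
opt.step()
```

In igms itself, the kernel can additionally sit on top of pretrained features from igms.featurize, and igms.kernels caches the kernel sums rather than recomputing them each time.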

Soon:

  • Generative Feature Matching Networks (Santos+ 2019), which are basically GMMNs with a moving-average trick to estimate the MMD.
  • FID and KID evaluation (KID is essentially done, just need a wrapper).
  • MMD three-sample tests like Bounliphone+ ICLR-16 (essentially done).
  • Optimized kernel two-sample tests like Sutherland+ ICLR-17 – not really an IGM, but it's basically implemented already (the plain permutation test these build on is sketched after this list).
  • Adaptive IGM learning rates based on the KID three-sample test, like in Bińkowski+ ICLR-18.
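
For reference, the permutation tests mentioned above work by pooling the two samples and recomputing the statistic under random relabelings. A minimal self-contained sketch, again with a plain Gaussian kernel and illustrative names rather than the igms API, might look like this:

```python
import torch


def rbf(x, y, sigma=1.0):
    # Gaussian RBF kernel on squared Euclidean distances.
    return torch.exp(-torch.cdist(x, y) ** 2 / (2 * sigma ** 2))


def mmd2_biased_from_gram(k, m):
    # Biased (V-statistic) MMD^2 from the pooled Gram matrix k, where the first
    # m rows/columns belong to sample X and the rest to sample Y.
    k_xx, k_yy, k_xy = k[:m, :m], k[m:, m:], k[:m, m:]
    return k_xx.mean() + k_yy.mean() - 2 * k_xy.mean()


def mmd_permutation_test(x, y, n_perms=500, sigma=1.0):
    m = x.shape[0]
    z = torch.cat([x, y], dim=0)
    k = rbf(z, z, sigma)                      # compute the pooled Gram matrix once
    observed = mmd2_biased_from_gram(k, m)
    count = 0
    for _ in range(n_perms):
        perm = torch.randperm(z.shape[0])
        k_perm = k[perm][:, perm]             # relabel by permuting rows and columns
        if mmd2_biased_from_gram(k_perm, m) >= observed:
            count += 1
    return (count + 1) / (n_perms + 1)        # permutation p-value


x = torch.randn(100, 2)
y = torch.randn(100, 2) + 0.5
print(mmd_permutation_test(x, y))             # small p-value: the samples differ
```

Computing the pooled Gram matrix once and only permuting its rows and columns keeps each permutation cheap compared with recomputing the kernel from scratch.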

Other things I might implement here eventually:

License: Apache License 2.0
