syiswell / MI_bounds_pytorch

PyTorch implementation of variational lower bounds on mutual information



PyTorch implementation of several variational lower bounds on mutual information, optimized with neural networks.


All of these bounds are derived from the work of Ben Poole et al., On Variational Bounds of Mutual Information, 2019.

Their TensorFlow implementation, along with useful insights, is available on their GitHub.

In the Jupyter notebook MI_bounds_pytorch.ipynb you'll find implementations of, and examples on toy data for, the following variational lower bounds on MI (minimal PyTorch sketches follow after the list):

  • NWJ / MINE-f lower bound. High variance, low bias. ref
  • Noise Contrastive Estimation based bound (infoNCE). Low variance, high bias. ref
  • Interpolated bound, which controls the bias-variance trade-off. ref
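
As a rough illustration, here is a minimal PyTorch sketch of the NWJ and infoNCE estimators. It assumes a critic that produces a [batch, batch] score matrix with scores[i, j] = f(x_i, y_j) and joint pairs on the diagonal; the function names are illustrative, not necessarily those used in the notebook.

```python
import math
import torch

def nwj_lower_bound(scores):
    # NWJ / MINE-f: E_{p(x,y)}[f(x,y)] - E_{p(x)p(y)}[exp(f(x,y) - 1)]
    batch_size = scores.size(0)
    joint_term = scores.diag().mean()  # positive (joint) pairs sit on the diagonal
    # off-diagonal entries approximate samples from the product of marginals
    mask = ~torch.eye(batch_size, dtype=torch.bool, device=scores.device)
    marginal_term = torch.exp(scores - 1.0)[mask].mean()
    return joint_term - marginal_term

def infonce_lower_bound(scores):
    # infoNCE: log(K) + mean_i [ f(x_i, y_i) - logsumexp_j f(x_i, y_j) ]
    batch_size = scores.size(0)
    nll = scores.diag() - torch.logsumexp(scores, dim=1)
    return nll.mean() + math.log(batch_size)
```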

General guidelines from Ben Poole et al.:

  • For representation learning, use the infoNCE lower bound (a toy training sketch follows after this list)
  • For MI estimation, use the interpolated bound with a low alpha
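
As a rough usage sketch of the first guideline (assumed setup: a separable critic trained on toy correlated Gaussians, not necessarily the notebook's exact configuration), the infoNCE bound can be maximized like this:

```python
import math
import torch
import torch.nn as nn

def infonce(scores):
    # infoNCE lower bound from a [batch, batch] score matrix (same estimator as above)
    return (scores.diag() - torch.logsumexp(scores, dim=1)).mean() + math.log(scores.size(0))

class SeparableCritic(nn.Module):
    # separable critic f(x, y) = g(x)^T h(y): one forward pass scores every pair in the batch
    def __init__(self, dim, hidden=256, embed=32):
        super().__init__()
        self.g = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, embed))
        self.h = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, embed))

    def forward(self, x, y):
        return self.g(x) @ self.h(y).t()  # [batch, batch] scores

dim, batch_size, rho = 20, 128, 0.8
critic = SeparableCritic(dim)
opt = torch.optim.Adam(critic.parameters(), lr=5e-4)

for step in range(2000):
    # toy data: y is a noisy copy of x, so I(X; Y) > 0 by construction
    x = torch.randn(batch_size, dim)
    y = rho * x + math.sqrt(1.0 - rho ** 2) * torch.randn(batch_size, dim)
    loss = -infonce(critic(x, y))  # maximize the lower bound
    opt.zero_grad()
    loss.backward()
    opt.step()

print("estimated lower bound on I(X; Y):", infonce(critic(x, y)).item())
```

Keep in mind that the infoNCE estimate is capped at log(batch size); this is the high-bias behaviour noted above and the reason Poole et al. recommend the interpolated bound when the goal is MI estimation.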
