tbrx / compiled-inference

Train neural networks to use as SMC and importance sampling proposals

Learn proposal distributions for importance sampling and SMC

This code provides a PyTorch implementation of the method described in the following paper for training neural networks that can be used as proposal distributions for importance sampling or sequential Monte Carlo:

Paige, B., & Wood, F. (2016). Inference Networks for Sequential Monte Carlo in Graphical Models. In Proceedings of the 33rd International Conference on Machine Learning. JMLR W&CP 48: 3040-3049.

The largest section of re-usable code is an implementation of a conditional variant of MADE as a PyTorch module, found in learn_smc_proposals.cde. This can be used to fit a conditional density estimator. There is a version for real-valued data and a version for binary data.
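
The sketch below is not the actual interface of learn_smc_proposals.cde; it only illustrates the general pattern of fitting a conditional density estimator q(x | y) by maximizing the likelihood of latent variables given observations, using a simple diagonal-Gaussian network in place of MADE. All names (ConditionalGaussianEstimator, sample_joint, train) are hypothetical.

```python
import torch
import torch.nn as nn


class ConditionalGaussianEstimator(nn.Module):
    """Maps an observation y to the mean and log-variance of a Gaussian over latents x.

    A simplified stand-in for the conditional MADE module (names here are hypothetical).
    """

    def __init__(self, y_dim, x_dim, hidden=64):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(y_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.mean = nn.Linear(hidden, x_dim)
        self.log_var = nn.Linear(hidden, x_dim)

    def forward(self, y):
        h = self.trunk(y)
        return self.mean(h), self.log_var(h)

    def log_prob(self, x, y):
        mu, log_var = self(y)
        dist = torch.distributions.Normal(mu, (0.5 * log_var).exp())
        return dist.log_prob(x).sum(-1)


def train(estimator, sample_joint, steps=1000, batch_size=256, lr=1e-3):
    """Fit q(x | y) on samples (x, y) drawn from the generative model's joint distribution."""
    opt = torch.optim.Adam(estimator.parameters(), lr=lr)
    for _ in range(steps):
        x, y = sample_joint(batch_size)          # (batch, x_dim), (batch, y_dim) tensors
        loss = -estimator.log_prob(x, y).mean()  # maximize log q(x | y)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return estimator
```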

  • The linear regression notebook provides an end-to-end usage example. It defines a non-conjugate regression model as a generative model in PyMC, then walks through defining a network to represent the inverse, training it on samples from the joint distribution, and using it as a proposal for inference (a schematic version of this last step is sketched below).
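
Continuing the hypothetical sketch above, a trained conditional density estimator can be used as an importance sampling proposal: draw latents from q(x | y) and weight each one by the ratio of the model's joint density to the proposal density. The log_joint function and the other names below are assumptions for illustration, not the repository's API.

```python
import torch


def importance_sample(estimator, log_joint, y, num_particles=1000):
    """Use the trained network as a proposal q(x | y) for self-normalized importance sampling.

    `log_joint(x, y)` is assumed to return the generative model's log density p(x, y)
    for a batch of latents x; it is a hypothetical stand-in, not part of this repository.
    """
    mu, log_var = estimator(y.expand(num_particles, -1))   # y: observation of shape (y_dim,)
    q = torch.distributions.Normal(mu, (0.5 * log_var).exp())
    x = q.sample()                                          # proposals x ~ q(x | y)
    log_w = log_joint(x, y) - q.log_prob(x).sum(-1)         # log weights p(x, y) / q(x | y)
    w = torch.softmax(log_w, dim=0)                         # self-normalized weights
    posterior_mean = (w.unsqueeze(-1) * x).sum(0)           # e.g. estimate E[x | y]
    return x, w, posterior_mean
```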

Two more involved examples are implemented in learn_smc_proposals.examples; pre-trained weights are included in this repository. Figures and inference results are shown in two accompanying notebooks.
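
In the sequential Monte Carlo setting, the learned network proposes each latent variable in turn while particles are reweighted and resampled between steps. The skeleton below is a generic sequential importance resampling loop under assumed interfaces (propose, log_weight_increment), not code taken from these examples.

```python
import torch


def smc(propose, log_weight_increment, observations, num_particles=1000):
    """Sequential importance resampling with a learned proposal (schematic, hypothetical API).

    `propose(t, y_t, particles)` should return proposed latents x_t of shape
    (num_particles, ...) together with their proposal log-probabilities;
    `log_weight_increment(t, x_t, particles, y_t)` should return the model's
    incremental log density (transition and likelihood terms) for those latents.
    """
    particles = []                       # one tensor of latents per time step
    log_w = torch.zeros(num_particles)
    for t, y_t in enumerate(observations):
        # Resample when the effective sample size drops below half the particle count
        w = torch.softmax(log_w, dim=0)
        if 1.0 / (w ** 2).sum() < num_particles / 2:
            idx = torch.multinomial(w, num_particles, replacement=True)
            particles = [x[idx] for x in particles]
            log_w = torch.zeros(num_particles)
        # Propose the next latent from the inference network and update weights
        x_t, log_q = propose(t, y_t, particles)
        log_w = log_w + log_weight_increment(t, x_t, particles, y_t) - log_q
        particles.append(x_t)
    return particles, torch.softmax(log_w, dim=0)
```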


License: GNU General Public License v3.0


Languages

Jupyter Notebook: 97.7%
Python: 2.3%