GradVI provides tools for Bayesian variational inference using gradient descent methods. Given observed data, the user specifies a prior and a task (e.g. linear regression, trend filtering) and runs posterior inference. The goal is to learn the parameters of the corresponding variational posterior family.
Currently, two prior distributions are provided within the software: (1) the adaptive shrinkage (ASH) prior and (2) the point-normal prior. For any other choice, the user has to define the prior distribution following the examples provided within the framework.
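As a minimal sketch of selecting a built-in prior (the `Ash` constructor arguments follow the example later in this README; the point-normal class name in the comment is an assumption, so check `gradvi.priors` for the exact name):

```python
import numpy as np
from gradvi.priors import Ash

# ASH prior: a (scaled) mixture of normals whose component standard
# deviations sk are fixed by the user.
sk = np.array([0.0, 0.1, 0.5, 1.0])
prior = Ash(sk, scaled = True)

# Point-normal prior: the class name below is an assumption, not
# confirmed from this README; check gradvi.priors for the exact name.
# from gradvi.priors import PointNormal
# prior = PointNormal()
```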
Related software:
- mr.ash.alpha: a coordinate ascent algorithm for multiple linear regression with the ASH prior.
- mr-ash-pen: a fast FORTRAN core for GradVI multiple regression using the ASH prior.
Theory for GradVI: Link to Overleaf
The software can be installed directly from GitHub using `pip`:

```bash
pip install git+https://github.com/stephenslab/gradvi
```
For development, download this repository and install using the `-e` flag:
```bash
git clone https://github.com/stephenslab/gradvi.git # or use the SSH link
cd gradvi
pip install -e .
```
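To verify the installation, a quick import check (using only the classes that appear later in this README):

```python
# Sanity check: these imports should succeed after installation.
import gradvi
from gradvi.priors import Ash
from gradvi.inference import LinearRegression
print("gradvi imported successfully")
```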
Functions are not documented yet. Here, I show an example to get started:
Example of linear regression
Simulate some data:
```python
import numpy as np
from gradvi.priors import Ash
from gradvi.inference import LinearRegression

n = 100          # number of samples
p = 200          # number of predictors
pcausal = 20     # number of causal predictors (non-zero coefficients)
s2 = 1.4         # residual variance
k = 10           # number of mixture components in the ASH prior
sk = (np.power(2.0, np.arange(k) / k) - 1)  # prior mixture standard deviations

np.random.seed(100)
X = np.random.normal(0, 1, size = n * p).reshape(n, p)
b = np.zeros(p)
b[:pcausal] = np.random.normal(0, 1, size = pcausal)
err = np.random.normal(0, np.sqrt(s2), size = n)
y = np.dot(X, b) + err
```
Perform regression:
```python
prior = Ash(sk, scaled = True)  # ASH prior with mixture standard deviations sk
gvlin = LinearRegression(debug = False, display_progress = False)
gvlin.fit(X, y, prior)
b_hat = gvlin.coef  # estimated regression coefficients
```
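Since the data were simulated, the fit can be checked against the ground truth. This evaluation snippet is a plain NumPy sketch, not part of the GradVI API:

```python
import numpy as np

# Root-mean-square error between true and estimated coefficients
rmse = np.sqrt(np.mean((b - b_hat) ** 2))

# In-sample agreement between observed and fitted responses
y_hat = np.dot(X, b_hat)
print(f"coefficient RMSE: {rmse:.4f}")
print(f"correlation(y, y_hat): {np.corrcoef(y, y_hat)[0, 1]:.4f}")
```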