Pathfinder Variational Inference#
Pathfinder [Zhang et al., 2021] is a variational inference algorithm that produces approximate samples from the posterior of a Bayesian model. It compares favorably to the widely used ADVI algorithm. On large problems, it should scale better than most MCMC algorithms, including dynamic HMC (i.e. NUTS), at the cost of a more biased estimate of the posterior. For details on the algorithm, see the arXiv preprint.
This algorithm is implemented in BlackJAX, a library of inference algorithms for JAX. Through PyMC's JAX backend (via PyTensor) we can run BlackJAX's Pathfinder on any PyMC model with some simple wrapper code.
This wrapper code is implemented in pymc-experimental. This tutorial shows how to run Pathfinder on your PyMC model.
You first need to install pymc-experimental:
pip install git+https://github.com/pymc-devs/pymc-experimental
You will likely also need blackjax, since the wrapper calls into BlackJAX's Pathfinder implementation; the remaining packages imported below come with a standard PyMC installation.
import arviz as az
import matplotlib.pyplot as plt
import numpy as np
import pymc as pm
import pymc_experimental as pmx
print(f"Running on PyMC v{pm.__version__}")
Running on PyMC v5.15.1+17.g508a1341f
First, define your PyMC model. Here, we use the 8-schools model.
# Data of the Eight Schools Model
J = 8
y = np.array([28.0, 8.0, -3.0, 7.0, -1.0, 1.0, 18.0, 12.0])
sigma = np.array([15.0, 10.0, 16.0, 11.0, 9.0, 11.0, 10.0, 18.0])
with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    tau = pm.HalfCauchy("tau", 5.0)
    # Non-centered parameterization: theta = mu + tau * z
    z = pm.Normal("z", mu=0, sigma=1, shape=J)
    theta = mu + tau * z
    obs = pm.Normal("obs", mu=theta, sigma=sigma, shape=J, observed=y)
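The model above uses a non-centered parameterization, so the per-school effects theta are only computed implicitly and will not show up in the returned samples. If you want them in the trace, one small variation (a sketch, not part of the original model; the name model_with_theta is illustrative) is to wrap theta in pm.Deterministic:
with pm.Model() as model_with_theta:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    tau = pm.HalfCauchy("tau", 5.0)
    z = pm.Normal("z", mu=0, sigma=1, shape=J)
    # Registering theta as a Deterministic stores the per-school effects in the samples
    theta = pm.Deterministic("theta", mu + tau * z)
    obs = pm.Normal("obs", mu=theta, sigma=sigma, shape=J, observed=y)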
Next, we call pmx.fit() and pass in the algorithm we want it to use.
with model:
    idata = pmx.fit(method="pathfinder", num_samples=1000)
Running pathfinder...
Transforming variables...
Just like pymc.sample(), this returns an InferenceData object (idata) with samples from the posterior. Note that because these samples do not come from an MCMC chain, convergence cannot be assessed in the regular way.
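You can still summarize the draws with ArviZ; just keep in mind that a single Pathfinder run returns one "chain", so MCMC diagnostics such as r_hat are not meaningful. A minimal illustrative snippet (not part of the original notebook):
# Posterior means, standard deviations and HDIs; skip MCMC-specific diagnostics
az.summary(idata, var_names=["mu", "tau"], kind="stats")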
az.plot_trace(idata)
plt.tight_layout();
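Because Pathfinder's posterior approximation is biased, on a small problem like this one it can be worth comparing against an MCMC baseline. The sketch below (assuming you are willing to pay the extra sampling cost; the variable names are illustrative, not part of the original notebook) draws a NUTS reference posterior and overlays the two:
with model:
    # NUTS baseline: slower, but asymptotically unbiased
    idata_nuts = pm.sample(1000, tune=1000)

# Compare the two posteriors for the shared parameters
az.plot_forest(
    [idata, idata_nuts],
    model_names=["pathfinder", "NUTS"],
    var_names=["mu", "tau"],
    combined=True,
);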
References#
Lu Zhang, Bob Carpenter, Andrew Gelman, and Aki Vehtari. Pathfinder: Parallel quasi-Newton variational inference. arXiv preprint arXiv:2108.03782, 2021.
Watermark#
%load_ext watermark
%watermark -n -u -v -iv -w -p xarray
Last updated: Wed Jul 17 2024
Python implementation: CPython
Python version : 3.11.9
IPython version : 8.25.0
xarray: 2024.6.0
numpy : 1.26.4
matplotlib : 3.8.4
arviz : 0.18.0
pymc_experimental: 0.1.1
pymc : 5.15.1+17.g508a1341f
Watermark: 2.4.3
License notice#
All the notebooks in this example gallery are provided under the MIT License which allows modification, and redistribution for any use provided the copyright and license notices are preserved.
Citing PyMC examples#
To cite this notebook, use the DOI provided by Zenodo for the pymc-examples repository.
Important
Many notebooks are adapted from other sources: blogs, books… In such cases you should cite the original source as well.
Also remember to cite the relevant libraries used by your code.
Here is a citation template in BibTeX:
@incollection{citekey,
  author = "<notebook authors, see above>",
  title = "<notebook title>",
  editor = "PyMC Team",
  booktitle = "PyMC examples",
  doi = "10.5281/zenodo.5654871"
}
which once rendered could look like: