Pathfinder Variational Inference#

Pathfinder [Zhang et al., 2021] is a variational inference algorithm that produces approximate samples from the posterior of a Bayesian model. It compares favorably to the widely used ADVI algorithm. On large problems, it should scale better than most MCMC algorithms, including dynamic HMC (i.e. NUTS), at the cost of a more biased estimate of the posterior. For details on the algorithm, see the arXiv preprint.

This algorithm is implemented in BlackJAX, a library of inference algorithms for JAX. Through PyMC's JAX backend (via Aesara), we can run BlackJAX's Pathfinder on any PyMC model with a small amount of wrapper code.

This wrapper code is implemented in pymcx. This tutorial shows how to run Pathfinder on your PyMC model.

You first need to install pymcx:

pip install git+https://github.com/pymc-devs/pymcx

import arviz as az
import numpy as np
import pymc as pm
import pymcx as pmx

print(f"Running on PyMC v{pm.__version__}")
Running on PyMC v4.2.0

First, define your PyMC model. Here, we use the 8-schools model.

# Data of the Eight Schools Model
J = 8
y = np.array([28.0, 8.0, -3.0, 7.0, -1.0, 1.0, 18.0, 12.0])
sigma = np.array([15.0, 10.0, 16.0, 11.0, 9.0, 11.0, 10.0, 18.0])
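As a quick sanity check on the data (not part of the Pathfinder workflow itself), the precision-weighted complete-pooling estimate of the treatment effect can be computed directly with NumPy:

```python
import numpy as np

y = np.array([28.0, 8.0, -3.0, 7.0, -1.0, 1.0, 18.0, 12.0])
sigma = np.array([15.0, 10.0, 16.0, 11.0, 9.0, 11.0, 10.0, 18.0])

# Inverse-variance weights: schools with smaller measurement error count more
weights = 1.0 / sigma**2
pooled_mean = np.sum(weights * y) / np.sum(weights)
print(f"Complete-pooling estimate: {pooled_mean:.2f}")  # roughly 7.7
```

The posterior mean of `mu` from the hierarchical model should land in the same neighborhood as this pooled estimate.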

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sigma=10.0)
    tau = pm.HalfCauchy("tau", 5.0)

    # Non-centered parameterization: sample standardized effects, then rescale
    theta = pm.Normal("theta", mu=0, sigma=1, shape=J)
    theta_1 = mu + tau * theta
    obs = pm.Normal("obs", mu=theta_1, sigma=sigma, shape=J, observed=y)
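The model uses a non-centered parameterization: `theta` is sampled on a standardized scale and rescaled as `mu + tau * theta`. A minimal NumPy sketch (with illustrative values, not model output) confirms that this transform yields draws distributed as Normal(mu, tau):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_val, tau_val = 5.0, 2.0

# Standardized draws, then shift and scale -- the same trick as theta_1 above
z = rng.standard_normal(100_000)
theta_1 = mu_val + tau_val * z

print(theta_1.mean(), theta_1.std())  # close to 5.0 and 2.0
```

This reparameterization avoids the "funnel" geometry that makes the centered eight-schools posterior hard for gradient-based samplers.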

Next, we call pmx.fit() and pass in the algorithm we want it to use.

with model:
    idata = pmx.fit(method="pathfinder")
/Users/twiecki/miniforge3/envs/pymc4/lib/python3.10/site-packages/pymc/sampling_jax.py:37: UserWarning: This module is experimental.
  warnings.warn("This module is experimental.")
Running pathfinder...
Transforming variables...

Just like pymc.sample(), this returns an InferenceData object with samples from the posterior. Note that because these samples do not come from an MCMC chain, convergence cannot be assessed with the usual diagnostics.

az.plot_trace(idata);
[Trace plot of the Pathfinder posterior samples for mu, tau, and theta]
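Beyond trace plots, `az.summary` gives point estimates and credible intervals. The snippet below builds a small synthetic InferenceData object (stand-in draws, not a real Pathfinder run) just to show the call; note that MCMC-specific diagnostics such as R-hat and ESS are not meaningful for Pathfinder draws, so we restrict the table to plain statistics:

```python
import arviz as az
import numpy as np

# Synthetic stand-in: one "chain" of 1000 draws for mu (illustrative only)
rng = np.random.default_rng(42)
idata = az.from_dict(posterior={"mu": rng.normal(4.0, 3.0, size=(1, 1000))})

# kind="stats" limits the output to means, standard deviations, and HDI bounds
print(az.summary(idata, kind="stats"))
```

With a real Pathfinder result, you would pass the `idata` returned by `pmx.fit` directly.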

References#

[1] Lu Zhang, Bob Carpenter, Andrew Gelman, and Aki Vehtari. Pathfinder: parallel quasi-Newton variational inference. arXiv preprint arXiv:2108.03782, 2021.

Authors#

Watermark#

%load_ext watermark
%watermark -n -u -v -iv -w -p aesara,xarray
Last updated: Fri Sep 30 2022

Python implementation: CPython
Python version       : 3.10.6
IPython version      : 8.4.0

aesara: 2.8.2
xarray: 2022.6.0

pymc_experimental: 0.0.1
arviz            : 0.12.1
pymc             : 4.2.0
numpy            : 1.22.4

Watermark: 2.3.1

License notice#

All the notebooks in this example gallery are provided under the MIT License, which allows modification and redistribution for any use, provided the copyright and license notices are preserved.

Citing PyMC examples#

To cite this notebook, use the DOI provided by Zenodo for the pymc-examples repository.

Important

Many notebooks are adapted from other sources: blogs, books… In such cases you should cite the original source as well.

Also remember to cite the relevant libraries used by your code.

Here is a citation template in BibTeX:

@incollection{citekey,
  author    = "<notebook authors, see above>",
  title     = "<notebook title>",
  editor    = "PyMC Team",
  booktitle = "PyMC examples",
  doi       = "10.5281/zenodo.5654871"
}

which once rendered could look like:

  • Thomas Wiecki. "Pathfinder Variational Inference". In: PyMC Examples. Ed. by PyMC Team. DOI: 10.5281/zenodo.5654871