## Pathfinder Variational Inference

Pathfinder is a variational inference algorithm that produces approximate samples from the posterior of a Bayesian model. It compares favorably to the widely used ADVI algorithm. On large problems it should scale better than most MCMC algorithms, including dynamic HMC (i.e. NUTS), at the cost of a more biased estimate of the posterior. For details on the algorithm, see the arXiv preprint by Zhang et al.

## Fitting a Reinforcement Learning Model to Behavioral Data with PyMC

Reinforcement Learning models are commonly used in behavioral research to model how animals and humans learn in situations where they make repeated choices that are followed by some form of feedback, such as a reward or a punishment.
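A common model of this kind is a simple Rescorla-Wagner / Q-learning agent that updates action values from prediction errors. Below is a minimal numpy simulation of such a learner on a two-armed bandit; the learning rate, softmax inverse temperature, and reward probabilities are illustrative assumptions, not values from the notebook.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials = 200
lr, beta = 0.3, 5.0              # illustrative learning rate / inverse temperature
true_p = np.array([0.2, 0.8])    # reward probability of each of two options
Q = np.zeros(2)                  # learned action values
choices, rewards = [], []

for _ in range(n_trials):
    # Softmax choice between the two options
    p_choice = np.exp(beta * Q) / np.exp(beta * Q).sum()
    c = rng.choice(2, p=p_choice)
    r = float(rng.random() < true_p[c])
    # Rescorla-Wagner update: move Q toward the prediction error target
    Q[c] += lr * (r - Q[c])
    choices.append(c)
    rewards.append(r)
```

Fitting such a model to observed `choices` and `rewards` then amounts to inferring `lr` and `beta` from the likelihood of the choice sequence.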

## Gaussian Processes using numpy kernel

Example of a simple Gaussian Process fit, adapted from Stan’s example-models repository.
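A minimal numpy sketch of the idea: build a covariance matrix from a hand-written kernel and draw a GP prior sample from it. The squared-exponential kernel and the hyperparameters `eta` and `rho` are illustrative assumptions, not necessarily those used in the notebook.

```python
import numpy as np

def sq_exp_kernel(x1, x2, eta=1.0, rho=1.0):
    """Squared-exponential (RBF) covariance between two sets of 1-D points."""
    d = x1[:, None] - x2[None, :]
    return eta**2 * np.exp(-0.5 * (d / rho) ** 2)

rng = np.random.default_rng(42)
x = np.linspace(0, 10, 50)
K = sq_exp_kernel(x, x) + 1e-8 * np.eye(len(x))   # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(len(x)), K)  # one draw from the GP prior
```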

## How to wrap a JAX function for use in PyMC


## Factor analysis


## Dirichlet mixtures of multinomials

This example notebook demonstrates the use of a Dirichlet mixture of multinomials (a.k.a Dirichlet-multinomial or DM) to model categorical count data. Models like this one are important in a variety of areas, including natural language processing, ecology, bioinformatics, and more.
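The generative story behind the DM distribution can be sketched in a few lines of numpy: draw category probabilities from a Dirichlet, then draw counts from a multinomial conditioned on those probabilities. The concentration vector and trial counts below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 5.0, 3.0])  # Dirichlet concentration (illustrative)
n_trials, n_draws = 100, 1000

# Dirichlet-multinomial sampling: probabilities first, then counts
p = rng.dirichlet(alpha, size=n_draws)                          # (n_draws, 3)
counts = np.array([rng.multinomial(n_trials, pi) for pi in p])  # (n_draws, 3)
```

Marginalizing over `p` yields count data that is overdispersed relative to a plain multinomial, which is what makes the DM useful for real categorical counts.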

The Dirichlet process is a flexible probability distribution over the space of distributions. Most generally, a probability distribution, $$P$$, on a set $$\Omega$$ is a [measure](https://en.wikipedia.org/wiki/Measure_(mathematics)) that assigns measure one to the entire space ($$P(\Omega) = 1$$). A Dirichlet process $$P \sim \textrm{DP}(\alpha, P_0)$$ is a measure that has the property that, for every finite disjoint partition $$S_1, \ldots, S_n$$ of $$\Omega$$, $$(P(S_1), \ldots, P(S_n)) \sim \textrm{Dirichlet}(\alpha P_0(S_1), \ldots, \alpha P_0(S_n)).$$
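A concrete way to draw from a DP is the standard stick-breaking construction, truncated to a finite number of atoms. The sketch below draws one (approximate) DP realization with a standard-normal base measure and evaluates it on a two-set partition of the real line; the concentration, truncation level, and partition are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, n_atoms = 2.0, 1000  # concentration and truncation level (illustrative)

# Truncated stick-breaking: weights w_k = beta_k * prod_{j<k} (1 - beta_j)
beta = rng.beta(1.0, alpha, size=n_atoms)
weights = beta * np.concatenate([[1.0], np.cumprod(1 - beta)[:-1]])
atoms = rng.standard_normal(n_atoms)  # atom locations drawn from P_0 = N(0, 1)

# Measure assigned to the partition S1 = (-inf, 0), S2 = [0, inf)
P_S1 = weights[atoms < 0].sum()
P_S2 = weights[atoms >= 0].sum()
```

With a deep enough truncation the weights sum to essentially one, and repeating this draw many times would show `P_S1` following the $$\textrm{Beta}(\alpha P_0(S_1), \alpha P_0(S_2))$$ distribution implied by the partition property.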