# Posts tagged variational inference

## Pathfinder Variational Inference

Pathfinder [Zhang *et al.*, 2021] is a variational inference algorithm that produces samples from an approximation to the posterior of a Bayesian model. It compares favorably to the widely used ADVI algorithm, and on large problems it should scale better than most MCMC algorithms, including dynamic HMC (i.e. NUTS), at the cost of a more biased estimate of the posterior. For details on the algorithm, see the arXiv preprint.
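The core idea can be sketched in a few lines of plain Python: follow an optimization path toward the posterior mode, score a Gaussian approximation at each point on the path by its Monte Carlo ELBO, and sample from the best one. This is only a 1-D toy illustration of that idea, not the library implementation: plain gradient ascent stands in for the L-BFGS run the real algorithm uses, and a fixed scale stands in for the inverse-Hessian covariance estimate it derives from L-BFGS.

```python
import math
import random

random.seed(0)

def logp(z):
    # Unnormalized log density of the target: a standard normal.
    return -0.5 * z * z

def grad_logp(z):
    return -z

def elbo(m, s, n_draws=200):
    # Monte Carlo ELBO for q = Normal(m, s) against the target.
    total = 0.0
    for _ in range(n_draws):
        z = random.gauss(m, s)
        logq = -0.5 * ((z - m) / s) ** 2 - math.log(s * math.sqrt(2 * math.pi))
        total += logp(z) - logq
    return total / n_draws

# Follow an optimization path from a far-away start toward the mode
# (gradient ascent here; the real algorithm records an L-BFGS path).
x, lr, path = 5.0, 0.3, []
for _ in range(20):
    x += lr * grad_logp(x)
    path.append(x)

# Keep the point on the path whose Gaussian approximation has the
# highest estimated ELBO, then draw approximate posterior samples.
scale = 1.0  # toy stand-in for the inverse-Hessian scale estimate
best_mean = max(path, key=lambda m: elbo(m, scale))
samples = [random.gauss(best_mean, scale) for _ in range(100)]
```

Because the ELBO is maximized near the target's mode, the selected point lands close to zero here; the payoff in general is that one optimization run, rather than a long sampling chain, yields a usable posterior approximation.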

## Variational Inference: Bayesian Neural Networks

- 30 May 2022
- Category: intermediate

**Probabilistic Programming**, **Deep Learning** and “**Big Data**” are among the biggest topics in machine learning. Within probabilistic programming (PP), much of the innovation is focused on making things scale using **Variational Inference**. In this example, I will show how to use **Variational Inference** in PyMC to fit a simple Bayesian Neural Network. I will also discuss how bridging Probabilistic Programming and Deep Learning can open up very interesting avenues to explore in future research.
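The trick that makes variational inference scale to neural-network weights is the reparameterization used by ADVI: instead of sampling a weight directly, draw standard normal noise and transform it, so the draw is a differentiable function of the variational parameters. A minimal stdlib sketch of that trick, with made-up (not fitted) values for the variational mean and log-standard-deviation:

```python
import math
import random

random.seed(42)

def sample_q(mu, log_sigma, n):
    # Reparameterization trick: draw eps ~ N(0, 1) and shift/scale it,
    # so each sample is a deterministic function of (mu, sigma) and the
    # noise -- gradients w.r.t. mu and log_sigma can flow through.
    sigma = math.exp(log_sigma)
    return [mu + sigma * random.gauss(0.0, 1.0) for _ in range(n)]

# One weight of a tiny "network": pretend its variational posterior q
# has mean 1.5 and log-std -1.0 (illustrative values only).
draws = sample_q(1.5, -1.0, 10_000)
mean = sum(draws) / len(draws)
```

In PyMC none of this is written by hand: ADVI applies the same transformation under the hood to every weight in the model, which is what lets a Bayesian neural network be fit by stochastic gradient optimization.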

## GLM: Mini-batch ADVI on hierarchical regression model

- 23 September 2021
- Category: intermediate

Unlike Gaussian mixture models, (hierarchical) regression models have independent variables (covariates). These variables enter the likelihood function, but are not themselves random variables. When using mini-batches, we should take care of that: the covariates must be subsampled together with the observed outcomes.
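The pitfall is subsampling the covariates and the observations independently, which scrambles their pairing. A minimal stdlib sketch of the safe pattern, with a toy dataset (`y = 2 * x + noise`, an illustrative choice) and a shared index list for both arrays:

```python
import random

random.seed(1)

# Toy regression data: x are fixed covariates, y = 2 * x + noise.
X = [float(i) for i in range(100)]
y = [2.0 * x + random.gauss(0.0, 0.1) for x in X]

def minibatches(X, y, batch_size):
    # Shuffle ONE index list and slice both arrays with it, so each
    # covariate stays paired with its own observation.
    idx = list(range(len(X)))
    random.shuffle(idx)
    for start in range(0, len(idx), batch_size):
        batch = idx[start:start + batch_size]
        yield [X[i] for i in batch], [y[i] for i in batch]

for X_mb, y_mb in minibatches(X, y, batch_size=20):
    # Pairs remain aligned: y_mb[j] was generated from X_mb[j].
    assert all(abs(yj - 2.0 * xj) < 1.0 for xj, yj in zip(X_mb, y_mb))
```

In PyMC itself, `pm.Minibatch` can take several arrays at once and subsamples them with shared indices, which serves the same purpose for the covariates and the observed data.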