# Posts by Chris Fonnesbeck

## Marginal Likelihood Implementation

- 04 June 2023

The `gp.Marginal` class implements the more common case of GP regression: the observed data are the sum of a GP and Gaussian noise. `gp.Marginal` has a `marginal_likelihood` method, a `conditional` method, and a `predict` method. Given a mean and covariance function, the function \(f(x)\) is modeled as

$$f(x) \sim \mathcal{GP}\big(m(x),\, k(x, x')\big).$$
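
A minimal sketch of this workflow, assuming PyMC v5 conventions; the data, priors, and variable names here are hypothetical stand-ins:

```python
import numpy as np
import pymc as pm

# Hypothetical 1-D training data
X = np.linspace(0, 10, 50)[:, None]
y = np.sin(X).ravel() + 0.3 * np.random.default_rng(0).normal(size=50)

with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)    # lengthscale
    eta = pm.HalfNormal("eta", sigma=2)       # signal amplitude
    sigma = pm.HalfNormal("sigma", sigma=1)   # observation noise

    cov = eta**2 * pm.gp.cov.ExpQuad(input_dim=1, ls=ell)
    gp = pm.gp.Marginal(cov_func=cov)

    # marginal_likelihood integrates out the GP analytically (GP + Gaussian noise)
    y_obs = gp.marginal_likelihood("y_obs", X=X, y=y, sigma=sigma)
    idata = pm.sample()

with model:
    # conditional gives the distribution of f at new inputs
    Xnew = np.linspace(0, 12, 100)[:, None]
    f_star = gp.conditional("f_star", Xnew)
    ppc = pm.sample_posterior_predictive(idata, var_names=["f_star"])
```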

## Multivariate Gaussian Random Walk

- 02 February 2023

This notebook shows how to fit a correlated time series using multivariate Gaussian random walks (GRWs). In particular, we perform a Bayesian regression of the time series data against a model dependent on GRWs.
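
As a hedged sketch of the idea (the data, dimensions, and priors below are invented for illustration), a latent multivariate GRW with an LKJ prior on the innovation covariance might look like:

```python
import numpy as np
import pymc as pm

T, D = 100, 3  # hypothetical: 100 time steps, 3 correlated series
observed = np.random.default_rng(1).normal(size=(T, D)).cumsum(axis=0)

with pm.Model():
    # Cholesky-parameterized covariance for the random-walk innovations
    chol, corr, stds = pm.LKJCholeskyCov(
        "chol", n=D, eta=2.0, sd_dist=pm.Exponential.dist(1.0)
    )
    # Latent multivariate Gaussian random walk
    grw = pm.MvGaussianRandomWalk("grw", mu=np.zeros(D), chol=chol, shape=(T, D))

    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("obs", mu=grw, sigma=sigma, observed=observed)
    idata = pm.sample()
```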

## Reparameterizing the Weibull Accelerated Failure Time Model

- 17 January 2023

The previous example notebook on Bayesian parametric survival analysis introduced two different accelerated failure time (AFT) models: Weibull and log-linear. In this notebook, we present three different parameterizations of the Weibull AFT model.
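
The sketch below shows one generic Weibull AFT setup, not necessarily any of the notebook's three parameterizations; right-censoring is handled with `pm.Censored`, and all data and priors are hypothetical:

```python
import numpy as np
import pymc as pm

# Hypothetical right-censored survival data: `time` holds the event time,
# or the censoring time where `event` is 0.
rng = np.random.default_rng(2)
time = rng.weibull(1.5, size=200) * 10
event = rng.integers(0, 2, size=200)
upper = np.where(event, np.inf, time)  # censoring bound; inf = fully observed

with pm.Model():
    alpha = pm.Gamma("alpha", alpha=2, beta=1)        # Weibull shape
    mu = pm.Normal("mu", 0, 5)                        # log-scale intercept
    beta = pm.Deterministic("beta", pm.math.exp(mu))  # Weibull scale

    # Where observed == upper, the censored (survival) likelihood is used
    pm.Censored(
        "obs",
        pm.Weibull.dist(alpha=alpha, beta=beta),
        lower=None,
        upper=upper,
        observed=time,
    )
    idata = pm.sample()
```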

## Bayesian Survival Analysis

- 17 January 2023

Survival analysis studies the distribution of the time to an event. Its applications span many fields across medicine, biology, engineering, and social science. This tutorial shows how to fit and analyze a Bayesian survival model in Python using PyMC.

## Introduction to Variational Inference with PyMC

- 13 January 2023

The most common strategy for computing posterior quantities of Bayesian models is via sampling, particularly Markov chain Monte Carlo (MCMC) algorithms. While sampling algorithms and associated computing have continually improved in performance and efficiency, MCMC methods still scale poorly with data size, and become prohibitive for more than a few thousand observations. A more scalable alternative to sampling is variational inference (VI), which re-frames the problem of computing the posterior distribution as an optimization problem.
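
As a rough illustration of the VI workflow (the model, data, and priors here are hypothetical), `pm.fit` maximizes the ELBO and returns an approximation one can sample from:

```python
import numpy as np
import pymc as pm

# Hypothetical data for a simple linear regression
rng = np.random.default_rng(3)
x = rng.normal(size=1000)
y = 2.0 * x + 0.5 + rng.normal(scale=0.3, size=1000)

with pm.Model():
    a = pm.Normal("a", 0, 1)
    b = pm.Normal("b", 0, 1)
    sigma = pm.HalfNormal("sigma", 1)
    pm.Normal("obs", mu=a * x + b, sigma=sigma, observed=y)

    # Fit a mean-field approximation by optimization rather than sampling,
    # then draw samples from the fitted approximation
    approx = pm.fit(n=20_000, method="advi")
    idata = approx.sample(1_000)
```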

## Empirical Approximation overview

- 13 January 2023

For most models we use MCMC sampling algorithms like Metropolis or NUTS. In PyMC we are used to storing traces of MCMC samples and then doing analysis with them. There is a similar concept for the variational inference submodule in PyMC: *Empirical*. This type of approximation stores particles for the SVGD sampler. There is no difference between independent SVGD particles and MCMC samples. *Empirical* acts as a bridge between MCMC sampling output and full-fledged VI utils like `apply_replacements` or `sample_node`. For the interface description, see variational_api_quickstart. Here we will just focus on `Empirical` and give an overview of things specific to the *Empirical* approximation.
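
A small sketch, assuming current `pm.fit` conventions, of fitting SVGD and sampling from the resulting *Empirical* approximation; the model and settings are placeholders:

```python
import pymc as pm

with pm.Model():
    mu = pm.Normal("mu", 0, 1)
    pm.Normal("obs", mu=mu, sigma=1, observed=[0.1, -0.3, 0.2])

    # SVGD maintains a set of interacting particles; the fitted result
    # is an Empirical approximation built from those particles
    approx = pm.fit(n=300, method="svgd", inf_kwargs={"n_particles": 100})
    idata = approx.sample(500)
```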

## GLM: Robust Linear Regression

- 10 January 2023

This notebook demonstrates robust linear regression: replacing the conventional Normal likelihood with a heavy-tailed Student-T likelihood reduces the influence of outliers on the regression fit.
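
A minimal sketch of the idea, with invented data and priors; only the likelihood changes relative to a standard linear regression:

```python
import numpy as np
import pymc as pm

# Hypothetical data with a few gross outliers
rng = np.random.default_rng(4)
x = np.linspace(0, 1, 100)
y = 1.0 + 2.0 * x + rng.normal(scale=0.1, size=100)
y[::20] += 3  # inject outliers

with pm.Model():
    intercept = pm.Normal("intercept", 0, 5)
    slope = pm.Normal("slope", 0, 5)
    sigma = pm.HalfNormal("sigma", 1)
    nu = pm.Exponential("nu", 1 / 10)  # degrees of freedom; small nu = heavy tails

    # Student-T likelihood instead of Normal makes the fit robust to outliers
    pm.StudentT("obs", nu=nu, mu=intercept + slope * x, sigma=sigma, observed=y)
    idata = pm.sample()
```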

## Analysis of An AR(1) Model in PyMC

- 07 January 2023

Consider the following AR(2) process, initialized in the infinite past:

$$y_t = \rho_0 + \rho_1 y_{t-1} + \rho_2 y_{t-2} + \epsilon_t,$$

where \(\epsilon_t \overset{iid}{\sim} \mathcal{N}(0,1)\). Suppose you'd like to learn about \(\rho\) from a sample of observations \(Y^T = \{ y_0, y_1, \ldots, y_T \}\).
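
A hedged sketch of fitting this AR(2) process with `pm.AR` (the observed series and priors are placeholders; `constant=True` makes `rho[0]` the intercept):

```python
import numpy as np
import pymc as pm

# Hypothetical observed series
y = np.random.default_rng(5).normal(size=200).cumsum()

with pm.Model():
    # rho[0] is the constant, rho[1:] the lag coefficients
    rho = pm.Normal("rho", 0, 1, shape=3)
    pm.AR(
        "y",
        rho=rho,
        sigma=1.0,                        # innovation scale fixed at 1, as in the text
        constant=True,
        init_dist=pm.Normal.dist(0, 10),  # distribution of the first two values
        observed=y,
    )
    idata = pm.sample()
```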

## Multi-output Gaussian Processes: Coregionalization models using Hadamard product

- 26 October 2022

This notebook shows how to implement the **Intrinsic Coregionalization Model** (ICM) and the **Linear Coregionalization Model** (LCM) using a Hadamard product between the Coregion kernel and input kernels. Multi-output Gaussian processes are discussed in this paper by Bonilla *et al.* [2007]. For further information about ICM and LCM, please check out the talk on Multi-output Gaussian Processes by Mauricio Alvarez, and his slides, with more references on the last page.
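
As an illustrative sketch of the ICM construction (shapes, priors, and data below are hypothetical), the Coregion kernel acts on an output-index column and is multiplied element-wise with an input kernel:

```python
import numpy as np
import pymc as pm

# Hypothetical inputs: column 0 is the usual input, column 1 an integer
# output index saying which of the D outputs each row belongs to
D = 2
X = np.column_stack([np.linspace(0, 10, 60), np.repeat([0, 1], 30)])
y = np.random.default_rng(6).normal(size=60)

with pm.Model():
    ell = pm.Gamma("ell", alpha=2, beta=1)
    # Input kernel acts on column 0 only
    k = pm.gp.cov.ExpQuad(input_dim=2, ls=ell, active_dims=[0])

    # Coregion kernel acts on the output index (column 1);
    # B = W W^T + diag(kappa) encodes the between-output covariance
    W = pm.Normal("W", 0, 1, shape=(D, 1))
    kappa = pm.Gamma("kappa", alpha=2, beta=1, shape=D)
    B = pm.gp.cov.Coregion(input_dim=2, W=W, kappa=kappa, active_dims=[1])

    # ICM covariance: Hadamard (element-wise) product of the two kernels
    cov = B * k

    sigma = pm.HalfNormal("sigma", 1)
    gp = pm.gp.Marginal(cov_func=cov)
    gp.marginal_likelihood("y_obs", X=X, y=y, sigma=sigma)
    idata = pm.sample()
```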

## A Primer on Bayesian Methods for Multilevel Modeling

- 24 October 2022

Hierarchical or multilevel modeling is a generalization of regression modeling.
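
For instance, a varying-intercepts model partially pools group-level intercepts through shared hyperpriors; the data and priors in this sketch are invented:

```python
import numpy as np
import pymc as pm

# Hypothetical grouped data: measurements nested within 8 groups
rng = np.random.default_rng(7)
group = rng.integers(0, 8, size=200)
x = rng.normal(size=200)
y = 1.0 + 0.5 * x + rng.normal(scale=0.5, size=200)

with pm.Model():
    # Hyperpriors shared across groups (partial pooling)
    mu_a = pm.Normal("mu_a", 0, 5)
    sigma_a = pm.HalfNormal("sigma_a", 1)

    # Varying intercepts: one per group, drawn from a common distribution
    a = pm.Normal("a", mu=mu_a, sigma=sigma_a, shape=8)
    b = pm.Normal("b", 0, 5)
    sigma = pm.HalfNormal("sigma", 1)

    pm.Normal("obs", mu=a[group] + b * x, sigma=sigma, observed=y)
    idata = pm.sample()
```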

## Gaussian Processes using numpy kernel

- 31 July 2022

Example of simple Gaussian Process fit, adapted from Stan’s example-models repository.
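
A sketch of such a kernel in plain numpy (the function and parameter names are my own, not the notebook's):

```python
import numpy as np

def exp_quad_kernel(x, xp, eta_sq, ell_sq, sigma_sq):
    """Exponentiated-quadratic covariance matrix built with plain numpy."""
    sq_dists = (x[:, None] - xp[None, :]) ** 2
    K = eta_sq * np.exp(-0.5 * sq_dists / ell_sq)
    # Add observation noise on the diagonal of the training covariance
    if x.shape == xp.shape and np.allclose(x, xp):
        K = K + sigma_sq * np.eye(len(x))
    return K

# Hypothetical usage
x = np.linspace(0, 10, 20)
K = exp_quad_kernel(x, x, eta_sq=1.0, ell_sq=2.0, sigma_sq=0.1)
```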

## Modeling spatial point patterns with a marked log-Gaussian Cox process

- 31 May 2022

The log-Gaussian Cox process (LGCP) is a probabilistic model of point patterns typically observed in space or time. It has two main components. First, an underlying *intensity* field \(\lambda(s)\) of positive real values is modeled over the entire domain \(X\) using an exponentially-transformed Gaussian process which constrains \(\lambda\) to be positive. Then, this intensity field is used to parameterize a Poisson point process which represents a stochastic mechanism for placing points in space. Some phenomena amenable to this representation include the incidence of cancer cases across a county, or the spatiotemporal locations of crime events in a city. Both spatial and temporal dimensions can be handled equivalently within this framework, though this tutorial only addresses data in two spatial dimensions.
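
One common way to make this model tractable is to discretize the domain into grid cells, so each cell count is Poisson with rate \(\exp(f)\) times the cell area. A hedged sketch under that assumption, with invented data and priors:

```python
import numpy as np
import pymc as pm

# Hypothetical setup: `centers` holds the (n, 2) cell centroids, `counts`
# the number of observed points per cell, and `area` the cell area, so
# counts[i] ~ Poisson(exp(f(centers[i])) * area)
rng = np.random.default_rng(8)
grid = np.linspace(0, 1, 10)
centers = np.dstack(np.meshgrid(grid, grid)).reshape(-1, 2)
counts = rng.poisson(2, size=len(centers))
area = (grid[1] - grid[0]) ** 2

with pm.Model():
    mu = pm.Normal("mu", 0, 3)
    ell = pm.Gamma("ell", alpha=2, beta=2)
    cov = pm.gp.cov.ExpQuad(input_dim=2, ls=ell)

    # Latent (un-marginalized) GP for the log-intensity field
    gp = pm.gp.Latent(mean_func=pm.gp.mean.Constant(c=mu), cov_func=cov)
    log_lambda = gp.prior("log_lambda", X=centers)

    pm.Poisson("counts", mu=pm.math.exp(log_lambda) * area, observed=counts)
    idata = pm.sample()
```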

## Gaussian Process for CO2 at Mauna Loa

- 26 April 2022

This Gaussian Process (GP) example shows how to design and combine covariance functions to capture the long-term trend and seasonal structure of the Mauna Loa CO2 record.
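
For instance (a sketch with hypothetical data and priors, not the notebook's exact model), covariance functions can be added and multiplied, and additive GPs keep their components separately addressable:

```python
import numpy as np
import pymc as pm

# Hypothetical stand-ins for the time axis (in years) and the CO2 series
t = np.linspace(0, 10, 200)[:, None]
y = 0.5 * t.ravel() + np.sin(2 * np.pi * t.ravel())

with pm.Model():
    # Smooth long-term trend
    ell_trend = pm.Gamma("ell_trend", alpha=4, beta=0.1)
    eta_trend = pm.HalfNormal("eta_trend", 2)
    k_trend = eta_trend**2 * pm.gp.cov.ExpQuad(1, ls=ell_trend)

    # Yearly periodic component
    ell_per = pm.Gamma("ell_per", alpha=2, beta=1)
    eta_per = pm.HalfNormal("eta_per", 2)
    k_per = eta_per**2 * pm.gp.cov.Periodic(1, period=1.0, ls=ell_per)

    # Additive GP: the trend and seasonal parts remain separate objects
    gp_trend = pm.gp.Marginal(cov_func=k_trend)
    gp_per = pm.gp.Marginal(cov_func=k_per)
    gp = gp_trend + gp_per

    sigma = pm.HalfNormal("sigma", 1)
    gp.marginal_likelihood("y_obs", X=t, y=y, sigma=sigma)
    mp = pm.find_MAP()  # point estimate, as an alternative to full sampling
```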

## Lasso regression with block updating

- 10 February 2022

Sometimes it is very useful to update a set of parameters together. For example, variables that are highly correlated are often best updated together. In PyMC, block updating is simple; it will be demonstrated here using the `step` argument of `pymc.sample`.
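
A minimal sketch, with a toy model: assigning several variables to one Metropolis step method makes their proposals joint:

```python
import numpy as np
import pymc as pm

y = np.random.default_rng(9).normal(size=100)

with pm.Model():
    mu = pm.Normal("mu", 0, 10)
    log_sigma = pm.Normal("log_sigma", 0, 2)
    pm.Normal("obs", mu=mu, sigma=pm.math.exp(log_sigma), observed=y)

    # One Metropolis step method proposing both variables together (a block),
    # passed to pm.sample via its `step` argument
    step = pm.Metropolis([mu, log_sigma])
    idata = pm.sample(step=step)
```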

## Bayesian Estimation Supersedes the T-Test

- 07 January 2022

This notebook implements the BEST (Bayesian Estimation Supersedes the t-Test) model of Kruschke [2013], which compares two groups by estimating the full posterior distributions of their means, standard deviations, and the resulting effect size.
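
A condensed sketch of the model (the group data, prior choices, and `nu_minus_1` parameterization here are illustrative):

```python
import numpy as np
import pymc as pm

# Hypothetical measurements for two groups
rng = np.random.default_rng(10)
group1 = rng.normal(101, 10, size=40)
group2 = rng.normal(100, 12, size=40)

with pm.Model():
    mu1 = pm.Normal("mu1", 100, 20)
    mu2 = pm.Normal("mu2", 100, 20)
    sigma1 = pm.HalfNormal("sigma1", 10)
    sigma2 = pm.HalfNormal("sigma2", 10)
    nu = pm.Exponential("nu_minus_1", 1 / 29) + 1  # shared degrees of freedom

    # Student-T likelihoods accommodate outliers in either group
    pm.StudentT("obs1", nu=nu, mu=mu1, sigma=sigma1, observed=group1)
    pm.StudentT("obs2", nu=nu, mu=mu2, sigma=sigma2, observed=group2)

    # Quantities of interest: difference of means and standardized effect size
    diff = pm.Deterministic("difference_of_means", mu1 - mu2)
    pm.Deterministic("effect_size", diff / pm.math.sqrt((sigma1**2 + sigma2**2) / 2))

    idata = pm.sample()
```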