Posts by Bill Engels
Gaussian Processes: HSGP Advanced Usage
- 28 June 2024
The Hilbert Space Gaussian process approximation is a low-rank GP approximation that is particularly well-suited to usage in probabilistic programming languages like PyMC. It approximates the GP using a pre-computed and fixed set of basis functions that don’t depend on the form of the covariance kernel or its hyperparameters. It’s a parametric approximation, so prediction in PyMC can be done as one would with a linear model via pm.Data or pm.set_data. You don’t need to define the .conditional distribution that non-parametric GPs rely on. This makes it much easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.
Gaussian Processes: HSGP Reference & First Steps
- 10 June 2024
The Hilbert Space Gaussian process approximation is a low-rank GP approximation that is particularly well-suited to usage in probabilistic programming languages like PyMC. It approximates the GP using a pre-computed and fixed set of basis functions that don’t depend on the form of the covariance kernel or its hyperparameters. It’s a parametric approximation, so prediction in PyMC can be done as one would with a linear model via pm.Data or pm.set_data. You don’t need to define the .conditional distribution that non-parametric GPs rely on. This makes it much easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.
Gaussian Processes: Latent Variable Implementation
- 06 June 2023
The gp.Latent class is a direct implementation of a Gaussian process without approximation. Given a mean and covariance function, we can place a prior on the function \(f(x)\), \(f(x) \sim \mathcal{GP}(m(x), k(x, x'))\).
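A minimal sketch of that prior, assuming a toy binary dataset and placeholder hyperpriors; because gp.Latent returns the latent function values directly, f can feed a non-Gaussian likelihood.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(1)
X = np.linspace(0, 10, 50)[:, None]
y = rng.integers(0, 2, 50)  # toy binary responses

with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)
    cov_func = pm.gp.cov.ExpQuad(1, ls=ell)

    # Exact (unapproximated) GP prior: f ~ GP(0, k(x, x'))
    gp = pm.gp.Latent(cov_func=cov_func)
    f = gp.prior("f", X=X)

    # The latent f can be passed through any link/likelihood
    pm.Bernoulli("y", logit_p=f, observed=y)
```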
Marginal Likelihood Implementation
- 04 June 2023
The gp.Marginal class implements the more common case of GP regression: the observed data are the sum of a GP and Gaussian noise. gp.Marginal has a marginal_likelihood method, a conditional method, and a predict method. Given a mean and covariance function, the function \(f(x)\) is modeled as \(f(x) \sim \mathcal{GP}(m(x), k(x, x'))\).
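A sketch of that workflow with assumed toy data and priors: marginal_likelihood integrates f out analytically against the Gaussian noise, and conditional gives the predictive distribution at new inputs.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
X = np.linspace(0, 10, 100)[:, None]
y = np.sin(X).flatten() + 0.2 * rng.normal(size=100)

with pm.Model() as model:
    ell = pm.InverseGamma("ell", mu=2.0, sigma=1.0)
    eta = pm.Exponential("eta", lam=1.0)
    cov_func = eta**2 * pm.gp.cov.ExpQuad(1, ls=ell)

    gp = pm.gp.Marginal(cov_func=cov_func)

    # y = f(x) + Gaussian noise, with f marginalized out analytically
    sigma = pm.Exponential("sigma", lam=1.0)
    gp.marginal_likelihood("y", X=X, y=y, sigma=sigma)

    idata = pm.sample()

# Predictions at new inputs come from the conditional distribution
X_new = np.linspace(10, 12, 30)[:, None]
with model:
    gp.conditional("f_pred", Xnew=X_new)
    preds = pm.sample_posterior_predictive(idata, var_names=["f_pred"])
```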
Multi-output Gaussian Processes: Coregionalization models using Hadamard product
- 04 October 2022
This notebook shows how to implement the Intrinsic Coregionalization Model (ICM) and the Linear Coregionalization Model (LCM) using a Hadamard product between the Coregion kernel and input kernels. Multi-output Gaussian processes are discussed in this paper by Bonilla et al. [2007]. For further information about ICM and LCM, please check out the talk on multi-output Gaussian processes by Mauricio Alvarez, and his slides, which list more references on the last page.
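A condensed sketch of the ICM construction under assumed toy data, two outputs, and a rank-1 W: the output index is carried as a second input column, and multiplying the input kernel by the Coregion kernel yields the Hadamard product.

```python
import numpy as np
import pymc as pm

# Stack inputs for 2 outputs: column 0 is x, column 1 is the output index
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 30)[:, None]
X = np.vstack([np.hstack([x, np.zeros_like(x)]),
               np.hstack([x, np.ones_like(x)])])
y = rng.normal(size=60)  # placeholder observations

n_outputs, rank = 2, 1
with pm.Model() as model:
    ell = pm.Gamma("ell", alpha=2, beta=1)
    # Input kernel acts only on column 0
    k_input = pm.gp.cov.ExpQuad(input_dim=2, ls=ell, active_dims=[0])

    # Coregion kernel on the output-index column; B = W Wᵀ + diag(kappa)
    W = pm.Normal("W", mu=0, sigma=1, shape=(n_outputs, rank))
    kappa = pm.Gamma("kappa", alpha=1.5, beta=1, shape=n_outputs)
    k_coreg = pm.gp.cov.Coregion(input_dim=2, W=W, kappa=kappa, active_dims=[1])

    # ICM covariance: elementwise (Hadamard) product of the two kernels
    cov_func = k_input * k_coreg
    gp = pm.gp.Marginal(cov_func=cov_func)
    sigma = pm.Exponential("sigma", lam=1.0)
    gp.marginal_likelihood("y", X=X, y=y, sigma=sigma)
```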
Kronecker Structured Covariances
- 04 October 2022
PyMC contains implementations for models that have Kronecker structured covariances. This patterned structure enables Gaussian process models to work on much larger datasets. Kronecker structure can be exploited when the inputs lie on a full grid and the covariance function is separable, i.e., it can be written as a product of kernels, one per grid dimension, so the full covariance matrix factors as a Kronecker product of smaller matrices.
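A standalone NumPy illustration (not code from the notebook) of why this structure scales: when \(K = K_1 \otimes K_2\), products with \(K\) never require forming it.

```python
import numpy as np

rng = np.random.default_rng(0)
n1, n2 = 50, 40  # a 2000-point grid; the full K would be 2000 x 2000
K1 = rng.normal(size=(n1, n1)); K1 = K1 @ K1.T
K2 = rng.normal(size=(n2, n2)); K2 = K2 @ K2.T
v = rng.normal(size=n1 * n2)

# Direct: build the full Kronecker product (quadratic memory)
direct = np.kron(K1, K2) @ v

# Fast: (K1 kron K2) vec(V) = vec(K1 @ V @ K2.T), touching only small matrices
V = v.reshape(n1, n2)  # row-major vec
fast = (K1 @ V @ K2.T).reshape(-1)

assert np.allclose(direct, fast)
```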
Gaussian Process for CO2 at Mauna Loa
- 04 April 2022
This Gaussian Process (GP) example shows how to design and combine covariance functions to model the atmospheric CO2 concentration measured at Mauna Loa, following the well-known example from Rasmussen and Williams.
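A sketch of the kind of kernel composition the notebook builds; every amplitude, lengthscale, and period below is a made-up placeholder, and a real analysis would put priors on them.

```python
import pymc as pm

with pm.Model():
    # Long-term smooth rising trend
    k_trend = 5.0**2 * pm.gp.cov.ExpQuad(1, ls=50.0)

    # Yearly seasonality whose shape may drift slowly over the years
    k_seasonal = (2.0**2 * pm.gp.cov.Periodic(1, period=1.0, ls=1.0)
                  * pm.gp.cov.Matern52(1, ls=100.0))

    # Shorter-term irregular variation
    k_medium = 0.5**2 * pm.gp.cov.RatQuad(1, alpha=1.0, ls=1.0)

    # Covariance functions compose under + and *
    cov_func = k_trend + k_seasonal + k_medium
    gp = pm.gp.Marginal(cov_func=cov_func)
```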
Mean and Covariance Functions
- 22 March 2022
A large set of mean and covariance functions is available in PyMC, and it is relatively easy to define custom ones. Since PyMC uses PyTensor, their gradients do not need to be defined by the user.
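For instance, a custom mean function only needs a __call__ method returning a vector of length X.shape[0]. This sketch mirrors the built-in pm.gp.mean.Linear; the class and parameter names are hypothetical.

```python
import pymc as pm
import pytensor.tensor as pt

class MyLinearMean(pm.gp.mean.Mean):
    """Hypothetical custom mean m(x) = X @ coeffs + intercept."""

    def __init__(self, coeffs, intercept=0.0):
        self.coeffs = coeffs
        self.intercept = intercept

    def __call__(self, X):
        # Must return a vector of length X.shape[0]; PyTensor supplies the
        # gradients automatically, so none need to be written by hand.
        return pt.squeeze(pt.dot(X, self.coeffs)) + self.intercept

with pm.Model():
    beta = pm.Normal("beta", 0, 1, shape=1)
    mean_func = MyLinearMean(coeffs=beta)
    cov_func = pm.gp.cov.ExpQuad(1, ls=1.0)
    gp = pm.gp.Marginal(mean_func=mean_func, cov_func=cov_func)
```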