# Posted in 2024

## Gaussian Processes: HSGP Advanced Usage

- 28 June 2024

The Hilbert Space Gaussian process approximation is a low-rank GP approximation that is particularly well-suited to usage in probabilistic programming languages like PyMC. It approximates the GP using a pre-computed and fixed set of basis functions that don’t depend on the form of the covariance kernel or its hyperparameters. It’s a *parametric* approximation, so prediction in PyMC can be done as one would with a linear model via `pm.Data` or `pm.set_data`. You don’t need to define the `.conditional` distribution that non-parametric GPs rely on. This makes it *much* easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.
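The "fixed basis functions weighted by the kernel's spectral density" idea can be sketched outside PyMC. The minimal 1-D version below assumes an ExpQuad (squared-exponential) kernel; the function names, the box size `L`, and the parameter values are illustrative, not part of the PyMC API. The key point is that the prediction is linear in the coefficients `beta`, which is why `pm.set_data`-style prediction works:

```python
import math

def hsgp_basis(x, m, L):
    # Laplace eigenfunctions on [-L, L]: phi_j(x) = sqrt(1/L) * sin(pi*j*(x+L)/(2L)).
    # These are fixed: they depend only on m and L, not on the kernel.
    return [[math.sqrt(1.0 / L) * math.sin(math.pi * j * (x_i + L) / (2 * L))
             for j in range(1, m + 1)] for x_i in x]

def expquad_spectral_density(ls, m, L, sigma=1.0):
    # 1-D ExpQuad spectral density evaluated at omega_j = j*pi/(2L).
    # The kernel hyperparameters enter only through these weights.
    return [sigma**2 * ls * math.sqrt(2 * math.pi)
            * math.exp(-0.5 * (ls * j * math.pi / (2 * L)) ** 2)
            for j in range(1, m + 1)]

def hsgp_predict(x, beta, ls, L):
    # f(x) ~= sum_j sqrt(S(omega_j)) * phi_j(x) * beta_j -- linear in beta,
    # so swapping in new x (a la pm.set_data) is just re-evaluating the basis.
    m = len(beta)
    phi = hsgp_basis(x, m, L)
    sqrt_S = [math.sqrt(s) for s in expquad_spectral_density(ls, m, L)]
    return [sum(sq * p * b for sq, p, b in zip(sqrt_S, row, beta))
            for row in phi]
```

In a PyMC model the `beta` coefficients get standard-normal priors and the rest is precomputed, which is what makes the HSGP behave like a linear model at prediction time.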

## Gaussian Processes: HSGP Reference & First Steps

- 10 June 2024

The Hilbert Space Gaussian process approximation is a low-rank GP approximation that is particularly well-suited to usage in probabilistic programming languages like PyMC. It approximates the GP using a pre-computed and fixed set of basis functions that don’t depend on the form of the covariance kernel or its hyperparameters. It’s a *parametric* approximation, so prediction in PyMC can be done as one would with a linear model via `pm.Data` or `pm.set_data`. You don’t need to define the `.conditional` distribution that non-parametric GPs rely on. This makes it *much* easier to integrate an HSGP, instead of a GP, into your existing PyMC model. Additionally, unlike many other GP approximations, HSGPs can be used anywhere within a model and with any likelihood function.

## Categorical regression

- 06 May 2024

In this example, we will model outcomes with more than two categories.

## Automatic marginalization of discrete variables

- 20 January 2024

PyMC is very amenable to sampling models with discrete latent variables. But if you insist on using the NUTS sampler exclusively, you will need to get rid of your discrete variables somehow. The best way to do this is by marginalizing them out: you then benefit from the Rao-Blackwell theorem and get a lower-variance estimate of your parameters.
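The core of marginalization is simple to state: replace the discrete latent with a log-sum-exp over its possible values. The stdlib sketch below uses a hypothetical two-coin mixture (all names and numbers are illustrative, and it is not the PyMC machinery itself): a latent `z` picks one of two coin biases, and we sum `z` out of the likelihood in log space, leaving a continuous-only density that NUTS could sample:

```python
import math

def logsumexp(vals):
    # Numerically stable log(sum(exp(v) for v in vals)).
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

def log_lik_given_z(heads, n, theta):
    # Binomial log-likelihood for a coin with bias theta
    # (the constant binomial coefficient is dropped).
    return heads * math.log(theta) + (n - heads) * math.log(1 - theta)

def marginal_log_lik(heads, n, pi, thetas):
    # Marginalize the discrete z:
    #   log p(y) = logsumexp_z [ log p(z) + log p(y | z) ]
    terms = [math.log(pz) + log_lik_given_z(heads, n, t)
             for pz, t in zip(pi, thetas)]
    return logsumexp(terms)
```

The resulting `marginal_log_lik` depends only on continuous quantities (`pi`, `thetas`), so a gradient-based sampler never sees `z`; posterior summaries of `z` can still be recovered afterwards from the per-component terms.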

## Bayesian Non-parametric Causal Inference

- 06 January 2024

There are few claims stronger than the assertion of a causal relationship, and few claims more contestable. A naive world model, rich with tenuous connections and non sequitur implications, is characteristic of conspiracy theory and idiocy. On the other hand, a refined and detailed knowledge of cause and effect, characterised by clear expectations, plausible connections and compelling counterfactuals, will steer you well through the buzzing, blooming confusion of the world.

## Baby Births Modelling with HSGPs

- 06 January 2024

This notebook provides an example of using the Hilbert Space Gaussian Process (HSGP) technique, introduced in [Solin and Särkkä, 2020], in the context of time series modeling. This technique has proven successful in speeding up models with Gaussian process components.