pymc.gp.HSGP.prior_linearized

HSGP.prior_linearized(X)

Linearized version of the HSGP. Returns the Laplace eigenfunctions and the square root of the power spectral density needed to create the GP.

This function allows the user to bypass the GP interface and work with the basis and coefficients directly. In this form, predictions on new data can be made with pm.set_data, just as with a linear model. It also enables computational speed-ups in multi-GP models, since GPs built with the same inputs, m, and L may share the same basis. The return values are the Laplace eigenfunctions phi and the square root of the power spectral density. An example is given below.

Parameters:
X: array-like

Function input values.

Returns:
phi: array_like

Either a NumPy or PyTensor 2D array of the fixed basis vectors. There are n rows, one per row of X, and prod(m) columns, one per basis vector.

sqrt_psd: array_like

Either a NumPy or PyTensor 1D array of the square roots of the power spectral densities.

Examples

import numpy as np
import pymc as pm

# A one-dimensional column vector of inputs.
X = np.linspace(0, 10, 100)[:, None]

with pm.Model() as model:
    eta = pm.Exponential("eta", lam=1.0)
    ell = pm.InverseGamma("ell", mu=5.0, sigma=5.0)
    cov_func = eta**2 * pm.gp.cov.ExpQuad(1, ls=ell)

    # m = [200] means 200 basis vectors for the first dimension
    # L = [10] means the approximation is valid over X in [-10, 10]
    gp = pm.gp.HSGP(m=[200], L=[10], cov_func=cov_func)

    # Set X as Data so it can be mutated later, and then pass it to the GP
    X = pm.Data("X", X)
    phi, sqrt_psd = gp.prior_linearized(X=X)

    # Specify a standard normal prior on the coefficients, the number of
    # which is given by the number of basis vectors, saved in `n_basis_vectors`.
    beta = pm.Normal("beta", size=gp.n_basis_vectors)

    # The (non-centered) GP approximation is given by:
    f = pm.Deterministic("f", phi @ (beta * sqrt_psd))

    # The centered approximation can be more efficient when
    # the GP is stronger than the noise
    # beta = pm.Normal("beta", sigma=sqrt_psd, size=gp.n_basis_vectors)
    # f = pm.Deterministic("f", phi @ beta)

    ...

# Then it works just like a linear regression to predict on new data.
# First mutate the data X,
x_new = np.linspace(-10, 10, 100)
with model:
    pm.set_data({"X": x_new[:, None]})

# and then make predictions for the GP using posterior predictive sampling
# (`idata` is the InferenceData returned by sampling the model above).
with model:
    ppc = pm.sample_posterior_predictive(idata, var_names=["f"])