pymc.gp.Latent
- class pymc.gp.Latent(*, mean_func=<pymc.gp.mean.Zero object>, cov_func=<pymc.gp.cov.Constant object>)
Latent Gaussian process.
The gp.Latent class is a direct implementation of a GP. No additive noise is assumed. It is called “Latent” because the underlying function values are treated as latent variables. It has a prior method and a conditional method. Given a mean and covariance function, the function \(f(x)\) is modeled as,
\[f(x) \sim \mathcal{GP}\left(\mu(x), k(x, x')\right)\]

Use the prior and conditional methods to construct random variables representing the unknown, or latent, function whose distribution is the GP prior or GP conditional. This GP implementation can be used to perform regression on data that is not normally distributed. For more information on the prior and conditional methods, see their docstrings.
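Because no additive noise is assumed, conditioning on new inputs reduces to the standard noise-free GP conditioning identity. As a sketch for a zero mean function (the notation \(K(\cdot,\cdot)\) for covariance matrices and \(x_*\) for the new inputs is illustrative here, not part of the class; see the conditional docstring for the exact form used),

\[f(x_*) \mid f \sim \mathcal{N}\left( K(x_*, x)\, K(x, x)^{-1} f,\; K(x_*, x_*) - K(x_*, x)\, K(x, x)^{-1} K(x, x_*) \right)\]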
- Parameters:
  - mean_func : Mean, default Zero
    The mean function.
  - cov_func : 2D array_like, or Covariance, default Constant
    The covariance function.
Examples
# A one dimensional column vector of inputs.
X = np.linspace(0, 1, 10)[:, None]

with pm.Model() as model:
    # Specify the covariance function.
    cov_func = pm.gp.cov.ExpQuad(1, ls=0.1)

    # Specify the GP.  The default mean function is `Zero`.
    gp = pm.gp.Latent(cov_func=cov_func)

    # Place a GP prior over the function f.
    f = gp.prior("f", X=X)

...

# After fitting or sampling, specify the distribution
# at new points with .conditional
Xnew = np.linspace(-1, 2, 50)[:, None]

with model:
    fcond = gp.conditional("fcond", Xnew=Xnew)
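As a sketch of the non-Gaussian regression use case mentioned above, the latent function f can be passed through a link function into any likelihood. The binary data y, the Bernoulli likelihood, the sampling settings, and names such as idata and ppc are illustrative assumptions, not part of this class.

import numpy as np
import pymc as pm

# Toy one dimensional inputs and binary observations (illustrative only).
X = np.linspace(0, 1, 10)[:, None]
y = np.array([0, 0, 0, 1, 1, 1, 1, 0, 0, 0])

with pm.Model() as model:
    cov_func = pm.gp.cov.ExpQuad(1, ls=0.1)
    gp = pm.gp.Latent(cov_func=cov_func)

    # Latent function values; the GP itself adds no noise.
    f = gp.prior("f", X=X)

    # Map f through an inverse-logit link to a Bernoulli likelihood.
    p = pm.Deterministic("p", pm.math.invlogit(f))
    pm.Bernoulli("y", p=p, observed=y)

    idata = pm.sample(500, tune=500, chains=2)

# Conditional distribution at new points, then posterior draws of it.
Xnew = np.linspace(-1, 2, 50)[:, None]
with model:
    fcond = gp.conditional("fcond", Xnew=Xnew)
    ppc = pm.sample_posterior_predictive(idata, var_names=["fcond"])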
Methods
- Latent.__init__(*[, mean_func, cov_func])
- Latent.conditional(name, Xnew[, given, jitter])
  Returns the conditional distribution evaluated over new input locations Xnew.
- Latent.marginal_likelihood(name, X, *args, ...)
- Latent.predict(Xnew[, point, given, diag, model])
- Latent.prior(name, X[, reparameterize, jitter])
  Returns the GP prior distribution evaluated over the input locations X.
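The optional arguments listed for prior and conditional can be passed directly. A minimal sketch follows; the specific jitter value and the choice of reparameterize=False are illustrative assumptions, not defaults recommended by this class.

import numpy as np
import pymc as pm

X = np.linspace(0, 1, 10)[:, None]
Xnew = np.linspace(-1, 2, 50)[:, None]

with pm.Model() as model:
    gp = pm.gp.Latent(cov_func=pm.gp.cov.ExpQuad(1, ls=0.1))

    # Centered parameterization of the prior (reparameterize defaults to True),
    # with a small jitter added to the covariance diagonal for numerical stability.
    f = gp.prior("f", X=X, reparameterize=False, jitter=1e-6)

    # Conditional distribution over the new inputs, with the same jitter.
    fcond = gp.conditional("fcond", Xnew=Xnew, jitter=1e-6)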
Attributes
- X
- f