# pymc.gp.Marginal

`class pymc.gp.Marginal(*, mean_func=<pymc.gp.mean.Zero object>, cov_func=<pymc.gp.cov.Constant object>)`

Marginal Gaussian process.

The `gp.Marginal` class implements the sum of a GP prior and additive Gaussian noise. It has `marginal_likelihood`, `conditional`, and `predict` methods. This GP implementation is suited to regression on data whose observation noise is normally distributed, since the latent function can then be marginalized out analytically. For more information on the `marginal_likelihood`, `conditional`, and `predict` methods, see their docstrings.
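Concretely, with mean function $m$ and covariance function $k$, marginalizing out the latent GP leaves the observed data with a Gaussian marginal; this is the standard result the class relies on:

$$
y \sim \mathcal{N}\big(m(X),\; K(X, X) + \sigma^2 I\big), \qquad K(X, X)_{ij} = k(x_i, x_j)
$$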

Parameters:

`mean_func` : Mean, default Zero

    The mean function.

`cov_func` : 2D array_like, or Covariance, default Constant

    The covariance function.

Examples

```python
# A one dimensional column vector of inputs.
X = np.linspace(0, 1, 10)[:, None]

with pm.Model() as model:
    # Specify the covariance function.
    cov_func = pm.gp.cov.ExpQuad(1, ls=0.1)

    # Specify the GP.  The default mean function is Zero.
    gp = pm.gp.Marginal(cov_func=cov_func)

    # The marginal likelihood of the observed data y
    # (a 1D array, assumed defined), with noise sigma.
    sigma = pm.HalfCauchy("sigma", beta=3)
    y_ = gp.marginal_likelihood("y", X=X, y=y, sigma=sigma)
```

...

```python
# After fitting or sampling, specify the distribution
# at new points with .conditional
Xnew = np.linspace(-1, 2, 50)[:, None]

with model:
    fcond = gp.conditional("fcond", Xnew=Xnew)
```

Methods

| Method | Description |
| --- | --- |
| `Marginal.__init__(*[, mean_func, cov_func])` | |
| `Marginal.conditional(name, Xnew[, ...])` | Returns the conditional distribution evaluated over new input locations Xnew. |
| `Marginal.marginal_likelihood(name, X, y[, ...])` | Returns the marginal likelihood distribution, given the input locations X and the data y. |
| `Marginal.predict(Xnew[, point, diag, ...])` | Return the mean vector and covariance matrix of the conditional distribution as numpy arrays, given a point, such as the MAP estimate or a sample from a trace. |
| `Marginal.prior(name, X, *args, **kwargs)` | |
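To illustrate the algebra that `predict` performs, here is a minimal NumPy sketch of the conditional mean and covariance for GP regression with a squared-exponential kernel. The helper names (`exp_quad`, `gp_predict`) and the fixed lengthscale are hypothetical, for illustration only; the actual method uses the model's `cov_func` and `mean_func` together with a `point` of parameter values.

```python
import numpy as np

def exp_quad(A, B, ls):
    # Squared-exponential covariance between the rows of A and B.
    # (Illustrative stand-in for a fitted covariance function.)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_predict(X, y, Xnew, sigma, ls=0.2):
    # Conditional (predictive) mean and covariance of a zero-mean GP,
    # given data y observed at inputs X with Gaussian noise sigma.
    K = exp_quad(X, X, ls) + sigma**2 * np.eye(len(X))
    Ks = exp_quad(X, Xnew, ls)       # cross-covariance, shape (n, m)
    Kss = exp_quad(Xnew, Xnew, ls)   # test covariance, shape (m, m)
    mu = Ks.T @ np.linalg.solve(K, y)
    cov = Kss - Ks.T @ np.linalg.solve(K, Ks)
    return mu, cov

X = np.linspace(0, 1, 10)[:, None]
y = np.sin(2 * np.pi * X).ravel()
Xnew = np.linspace(-0.1, 1.1, 25)[:, None]
mu, cov = gp_predict(X, y, Xnew, sigma=1e-3)
```

With low noise, the predictive mean interpolates the data at the training inputs, and `cov` shrinks toward zero there while growing away from the data.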

Attributes

- `X`
- `sigma`
- `y`