pymc.model.core.Potential

pymc.model.core.Potential(name, var, model=None, dims=None)
Add an arbitrary term to the model log-probability.
- Parameters:
  - name : str
    Name of the potential variable to be registered in the model.
  - var : tensor_like
    Expression to be added to the model joint logp.
  - model : Model, optional
    The model object to which the potential function is added. If None is provided, the current model in the context stack is used.
  - dims : str or tuple of str, optional
    Dimension names for the variable.
- Returns:
  - var : tensor_like
    The registered, named model variable.
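For instance, a vector-valued expression can be registered with named dimensions via dims. A minimal sketch, assuming hypothetical coords and variable names:

```python
import pymc as pm

coords = {"group": ["a", "b", "c"]}
with pm.Model(coords=coords) as model:
    x = pm.Normal("x", mu=0, sigma=1, dims="group")
    # One penalty per group; the elementwise terms are summed into the
    # model joint logp.
    penalty = pm.Potential("x_penalty", -(x**2), dims="group")
```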
Warning

Potential terms only influence probability-based sampling, such as pm.sample, but not forward sampling like pm.sample_prior_predictive or pm.sample_posterior_predictive. A warning is raised when doing forward sampling with models containing Potential terms.

Examples
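As a first illustration of the warning above, here is a minimal sketch (hypothetical model) showing that forward sampling ignores a Potential term:

```python
import pymc as pm

with pm.Model() as model:
    x = pm.Normal("x", mu=0, sigma=1)
    pm.Potential("x_constraint", pm.math.log(pm.math.switch(x >= 0, 1, 0)))
    # The Potential is ignored here: the draws of x are plain Normal(0, 1)
    # samples, and PyMC raises a warning about the Potential term.
    prior = pm.sample_prior_predictive()
```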
In this example, we define a constraint on x to be greater than or equal to 0. The statement pm.math.log(pm.math.switch(constraint, 1, 0)) adds either 0 or -inf to the model logp, depending on whether the constraint is met. During sampling, any proposals where x is negative will be rejected.

```python
import pymc as pm

with pm.Model() as model:
    x = pm.Normal("x", mu=0, sigma=1)
    constraint = x >= 0
    potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1, 0)))
```
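The effect of the hard constraint can be checked by evaluating the joint logp of the model above at fixed points, here using Model.compile_logp:

```python
logp_fn = model.compile_logp()
print(logp_fn({"x": 1.0}))   # finite: constraint met, the Potential adds log(1) = 0
print(logp_fn({"x": -1.0}))  # -inf: constraint violated, the Potential adds log(0)
```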
Instead, with a soft constraint like pm.math.log(pm.math.switch(constraint, 1, 0.5)), the sampler will be less likely, but not forbidden, to accept negative values for x.

```python
import pymc as pm

with pm.Model() as model:
    x = pm.Normal("x", mu=0, sigma=1)
    constraint = x >= 0
    potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1.0, 0.5)))
```
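A minimal check of the model above, reusing Model.compile_logp as before: violating the soft constraint costs a finite log(0.5) of logp instead of -inf:

```python
logp_fn = model.compile_logp()
# By the symmetry of Normal(0, 1), the difference is exactly -log(0.5) ≈ 0.69.
print(logp_fn({"x": 1.0}) - logp_fn({"x": -1.0}))
```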
A Potential term can depend on multiple variables. In the following example, the soft_sum_constraint potential encourages x and y to have a small sum. The more the sum deviates from zero, the more negative the penalty -((x + y)**2) becomes.

```python
import pymc as pm

with pm.Model() as model:
    x = pm.Normal("x", mu=0, sigma=10)
    y = pm.Normal("y", mu=0, sigma=10)
    soft_sum_constraint = pm.Potential("soft_sum_constraint", -((x + y) ** 2))
```
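A sketch of what this penalty does in practice (the sampler settings are illustrative only):

```python
with model:
    idata = pm.sample(1000, chains=2, random_seed=42)

# The penalty pulls x + y toward zero: the posterior std of the sum is far
# smaller than the ~14 expected from two independent Normal(0, 10) priors.
posterior_sum = idata.posterior["x"] + idata.posterior["y"]
print(float(posterior_sum.std()))
```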
A Potential can be used to define a specific prior term. The following example imposes a power law prior on max_items, of the form log(1/max_items), which penalizes very large values of max_items.

```python
import pymc as pm

with pm.Model() as model:
    # p(max_items) = 1 / max_items
    max_items = pm.Uniform("max_items", lower=1, upper=100)
    pm.Potential("power_prior", pm.math.log(1 / max_items))
    n_items = pm.Uniform("n_items", lower=1, upper=max_items, observed=60)
```
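A sketch of sampling this model; the initial value is an assumption chosen so that the starting logp is finite (any max_items below the observed 60 gives the observed Uniform a logp of -inf):

```python
with model:
    idata = pm.sample(1000, chains=2, initvals={"max_items": 80.0}, random_seed=42)

# The posterior is supported on (60, 100]; the 1/max_items penalty makes
# its density highest just above 60. The hard boundary at 60 can trigger
# divergences, so this is only a sketch.
print(float(idata.posterior["max_items"].mean()))
```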
A Potential can be used to define a specific likelihood term. In the following example, a normal likelihood term (up to an additive constant) is added for fixed data. The same result would be obtained by using an observed Normal variable.

```python
import pymc as pm

def normal_logp(value, mu, sigma):
    # Normal log-density, omitting the constant -0.5 * log(2 * pi)
    return -0.5 * ((value - mu) / sigma) ** 2 - pm.math.log(sigma)

with pm.Model() as model:
    mu = pm.Normal("mu")
    sigma = pm.HalfNormal("sigma")
    data = [0.1, 0.5, 0.9]
    llike = pm.Potential("llike", normal_logp(data, mu, sigma))
```
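For comparison, a sketch of the equivalent model with an observed Normal variable; the built-in logp additionally includes the constant -0.5 * log(2 * pi), which does not depend on mu or sigma and therefore leaves the posterior unchanged:

```python
import pymc as pm

with pm.Model() as model_observed:
    mu = pm.Normal("mu")
    sigma = pm.HalfNormal("sigma")
    llike = pm.Normal("llike", mu=mu, sigma=sigma, observed=[0.1, 0.5, 0.9])
```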