pymc.Potential

pymc.Potential(name, var, model=None, dims=None)

Add an arbitrary factor potential to the model likelihood.

The Potential function is used to add arbitrary factors (such as constraints or other likelihood components) to adjust the probability density of the model.

Parameters
name : str

Name of the potential variable to be registered in the model.

var : tensor_like

Expression to be added to the model joint logp.

model : Model, optional

The model object to which the potential function is added. If None is provided, the current model in the context stack is used.

dims : str or tuple of str, optional

Dimension names for the variable.

Returns
var : tensor_like

The registered, named model variable.

Warning

Potential functions only influence logp-based sampling. They therefore affect pm.sample, but are ignored by forward sampling functions such as pm.sample_prior_predictive and pm.sample_posterior_predictive.

Examples

In the following example, we constrain x to be greater than or equal to 0 via pm.Potential. We pass pm.math.log(pm.math.switch(constraint, 1, 0)) as the second argument: this expression evaluates to 0 when the constraint is met and to -inf when it is violated, and it is added to the model's joint log-probability. As a result, draws in which x < 0 have zero probability density and are never accepted; the constraint is hard.

import numpy as np
import pymc as pm

data = np.random.normal(size=100)  # placeholder observed data for the examples

with pm.Model() as model:
    x = pm.Normal("x", mu=0, sigma=1)
    y = pm.Normal("y", mu=x, sigma=1, observed=data)
    constraint = x >= 0
    potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1, 0)))
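
As the warning above notes, potentials only enter through the model logp. A minimal sketch of the consequence, reusing the model just defined: logp-based sampling respects the constraint, while forward sampling ignores it.

with model:
    # logp-based sampling respects the potential: retained draws satisfy
    # x >= 0 (hard constraints like this one can hurt sampler efficiency)
    idata = pm.sample()

    # forward sampling ignores potentials entirely, so prior draws of x
    # can be negative
    prior = pm.sample_prior_predictive()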

If we instead pass pm.math.log(pm.math.switch(constraint, 1.0, 0.5)), the potential still penalizes the likelihood when the constraint is not met, but no longer rules such draws out entirely. Here, Potential expresses a soft constraint: the posterior probability of the parameters decreases as they move away from the constraint, but it does not become exactly zero. The sampler can therefore generate values that violate the constraint, just with lower probability.

with pm.Model() as model:
    x = pm.Normal("x", mu=0.1, sigma=1)
    y = pm.Normal("y", mu=x, sigma=1, observed=data)
    constraint = x >= 0
    potential = pm.Potential("x_constraint", pm.math.log(pm.math.switch(constraint, 1.0, 0.5)))
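
To see the size of the soft penalty, one can compare the model logp at points on either side of the constraint. A minimal sketch using Model.compile_logp, with a stripped-down model so that the only asymmetry between x = 1 and x = -1 is the potential itself:

with pm.Model() as toy:
    x = pm.Normal("x", mu=0, sigma=1)
    pm.Potential("x_constraint", pm.math.log(pm.math.switch(x >= 0, 1.0, 0.5)))

logp = toy.compile_logp()
# The Normal(0, 1) prior term is symmetric around 0, so the difference is
# exactly the penalty applied by the potential: -log(0.5) = log(2) ≈ 0.693
print(logp({"x": 1.0}) - logp({"x": -1.0}))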

In the next example, Potential is used to define an arbitrary prior: here, the prior knowledge that values of max_items are likely to be small rather than large. The prior is specified by passing the log of the inverse of max_items to Potential. Larger values of max_items thus receive a lower prior probability density, and smaller values a higher one. When the model is sampled, the posterior distribution of max_items given the observed value of n_items is influenced by the power-law prior defined in the Potential.

with pm.Model() as model:
    # p(max_items) = 1 / max_items
    max_items = pm.Uniform("max_items", lower=1, upper=100)
    pm.Potential("power_prior", pm.math.log(1/max_items))

    n_items = pm.Uniform("n_items", lower=1, upper=max_items, observed=60)
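
A quick way to see the prior's pull (a sketch; exact numbers vary by run): sample the model and summarize max_items. The observation restricts max_items to [60, 100], and both the Uniform likelihood and the 1/max_items prior favor smaller values, so posterior mass concentrates toward 60. This assumes arviz is available as az.

import arviz as az

with model:
    idata = pm.sample(random_seed=42)

print(az.summary(idata, var_names=["max_items"]))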

In the next example, the soft_sum_constraint potential encourages x and y to have a small sum, effectively adding a soft constraint on the relationship between the two variables. This can be useful when you want the sum of multiple variables to stay within a certain range without enforcing an exact value. The larger the deviation of x + y from zero, the more negative the term -((x + y)**2) becomes and the lower the log-probability of the draw. The sampler can still generate small deviations, just with lower probability, which is what makes this a soft constraint.

with pm.Model() as model:
    x = pm.Normal("x", mu=0.1, sigma=1)
    y = pm.Normal("y", mu=x, sigma=1, observed=data)
    soft_sum_constraint = pm.Potential("soft_sum_constraint", -((x + y)**2))
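
Note that -((x + y)**2) equals, up to an additive constant, the log-density of a Normal with mean 0 and sigma = sqrt(0.5) evaluated at x + y, so the same soft constraint can also be written with pm.logp. A sketch of this equivalent formulation (model_equiv is a hypothetical name):

with pm.Model() as model_equiv:
    x = pm.Normal("x", mu=0.1, sigma=1)
    y = pm.Normal("y", mu=x, sigma=1, observed=data)
    pm.Potential(
        "soft_sum_constraint",
        pm.logp(pm.Normal.dist(mu=0, sigma=np.sqrt(0.5)), x + y),
    )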

The potential's value is incorporated directly into the model log-probability. It should therefore be -inf (or very negative) when a constraint is violated, so that such draws are rejected; a value of 0 has no effect, and positive values make the corresponding proposals more likely to be accepted.
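
As a toy check of this behavior (a sketch, not part of the original example), compare two otherwise identical models, one of which adds a potential that depends on x:

with pm.Model() as m_plain:
    x = pm.Normal("x")

with pm.Model() as m_pot:
    x = pm.Normal("x")
    pm.Potential("penalty", -x**2)

# The potential is added verbatim to the logp: at x = 2.0 the two models
# differ by exactly -4.0
print(m_pot.compile_logp()({"x": 2.0}) - m_plain.compile_logp()({"x": 2.0}))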