pymc.NormalMixture#

class pymc.NormalMixture(name, w, mu, sigma=None, tau=None, comp_shape=(), **kwargs)[source]#

Normal mixture log-likelihood

\[f(x \mid w, \mu, \sigma^2) = \sum_{i = 1}^n w_i N(x \mid \mu_i, \sigma^2_i)\]

Support

\(x \in \mathbb{R}\)

Mean

\(\sum_{i = 1}^n w_i \mu_i\)

Variance

\(\sum_{i = 1}^n w_i \left(\sigma^2_i + \mu^2_i\right) - \left(\sum_{i = 1}^n w_i \mu_i\right)^2\)
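
As a quick numerical check of the density and mean formulas above, here is a pure-NumPy sketch; the weights, means, and standard deviations are arbitrary illustrative values, and `mixture_pdf` is a helper defined here, not part of the PyMC API. The variance is computed via the law of total variance.

```python
import numpy as np

# Illustrative three-component mixture (values chosen arbitrarily)
w = np.array([0.2, 0.5, 0.3])          # weights sum to 1
mu = np.array([-1.0, 0.0, 2.0])        # component means
sigma = np.array([0.5, 1.0, 0.8])      # component standard deviations

def mixture_pdf(x, w, mu, sigma):
    """f(x | w, mu, sigma) = sum_i w_i N(x | mu_i, sigma_i^2)."""
    x = np.asarray(x)[..., None]       # broadcast x against the component axis
    comp = np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    return (w * comp).sum(axis=-1)

# The density integrates to ~1 (Riemann sum on a wide grid)
xs = np.linspace(-10, 10, 20001)
dx = xs[1] - xs[0]
pdf = mixture_pdf(xs, w, mu, sigma)
total = pdf.sum() * dx

# Numerical mean and variance of the mixture
mean = (xs * pdf).sum() * dx
var = ((xs - mean) ** 2 * pdf).sum() * dx

# Closed forms: weighted mean, and variance by the law of total variance
mean_closed = (w * mu).sum()
var_closed = (w * (sigma**2 + mu**2)).sum() - mean_closed**2
```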

Parameters
w : tensor_like of float

the mixture weights: 0 <= w_i <= 1, summing to 1 over the last axis

mu : tensor_like of float

the component means

sigma : tensor_like of float

the component standard deviations

tau : tensor_like of float

the component precisions

comp_shape : tuple of int

the shape of the Normal components; note that it differs from the shape of the mixture distribution, with the last axis indexing the mixture components.
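
The component/mixture shape relationship can be sketched in plain NumPy; the shapes and names below are illustrative, not the PyMC internals:

```python
import numpy as np

# Hypothetical shapes: 1000 observations mixed over 3 components.
# The component shape appends the component axis last, e.g. (1000, 3),
# while the mixture itself has shape (1000,).
n_obs, n_components = 1000, 3
comp_shape = (n_obs, n_components)

mu = np.array([-1.0, 0.0, 2.0])   # per-component parameter, shape (3,)
x = np.zeros(n_obs)               # mixture-shaped values, shape (1000,)

# Component-wise quantities broadcast against per-component parameters...
per_component = np.broadcast_to(x[:, None] - mu, comp_shape)

# ...and the last (component) axis is reduced away to get the mixture shape
mixture_shaped = per_component.sum(axis=-1)
```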

Notes

Specify exactly one of sigma or tau, not both.
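
The two parameterizations are equivalent via tau_i = 1 / sigma_i^2, which a small NumPy sketch can confirm; the helper functions and values here are illustrative, not PyMC code:

```python
import numpy as np

# Illustrative values: tau_i = 1 / sigma_i**2 describes the same component
sigma = np.array([0.5, 1.0, 0.8])
tau = 1.0 / sigma**2

def normal_pdf_sigma(x, mu, sigma):
    """Normal density parameterized by standard deviation."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def normal_pdf_tau(x, mu, tau):
    """The same density parameterized by precision."""
    return np.sqrt(tau / (2 * np.pi)) * np.exp(-0.5 * tau * (x - mu) ** 2)

x = np.linspace(-3, 3, 7)
mu = np.array([0.0, 1.0, -1.0])

# Both parameterizations give identical densities for every component
d_sigma = normal_pdf_sigma(x[:, None], mu, sigma)
d_tau = normal_pdf_tau(x[:, None], mu, tau)
```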

Examples

import numpy as np
import pymc as pm

# Synthetic data from three overlapping Gaussians, for illustration
rng = np.random.default_rng(42)
data = np.concatenate([rng.normal(loc, 0.5, size=200) for loc in (1.0, 2.0, 3.0)])

n_components = 3

with pm.Model() as gauss_mix:
    # Ordering the means avoids label switching between components
    μ = pm.Normal(
        "μ",
        mu=data.mean(),
        sigma=10,
        shape=n_components,
        transform=pm.transforms.ordered,
        initval=[1, 2, 3],
    )
    σ = pm.HalfNormal("σ", sigma=10, shape=n_components)
    weights = pm.Dirichlet("w", np.ones(n_components))

    y = pm.NormalMixture("y", w=weights, mu=μ, sigma=σ, observed=data)

Methods

NormalMixture.__init__(*args, **kwargs)

NormalMixture.dist(w, mu[, sigma, tau, ...])