pymc.NormalMixture
- class pymc.NormalMixture(name, w, mu, sigma=None, tau=None, **kwargs)
Normal mixture log-likelihood.
\[f(x \mid w, \mu, \sigma^2) = \sum_{i = 1}^n w_i N(x \mid \mu_i, \sigma^2_i)\]
Support
\(x \in \mathbb{R}\)
Mean
\(\sum_{i = 1}^n w_i \mu_i\)
Variance
\(\sum_{i = 1}^n w_i (\sigma^2_i + \mu_i^2) - \left(\sum_{i = 1}^n w_i \mu_i\right)^2\)
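The mean and variance formulas above can be checked numerically. The sketch below uses plain NumPy with hypothetical 3-component parameters (the weights, means, and standard deviations are illustrative, not from the source), comparing the closed-form moments against a Monte Carlo estimate drawn from the generative process of the mixture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 3-component mixture parameters (illustration only).
w = np.array([0.2, 0.5, 0.3])      # mixture weights, sum to 1
mu = np.array([-1.0, 0.0, 2.0])    # component means
sigma = np.array([0.5, 1.0, 1.5])  # component standard deviations

# Closed-form moments from the formulas above.
mean = np.sum(w * mu)
var = np.sum(w * (sigma**2 + mu**2)) - mean**2

# Monte Carlo check: pick a component per draw, then sample that normal.
idx = rng.choice(3, size=200_000, p=w)
x = rng.normal(mu[idx], sigma[idx])

print(np.isclose(mean, x.mean(), atol=0.02))  # True
print(np.isclose(var, x.var(), atol=0.05))    # True
```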
- Parameters:
  - w : tensor_like of float
    w >= 0 and w <= 1, the mixture weights
  - mu : tensor_like of float
    the component means
  - sigma : tensor_like of float
    the component standard deviations
  - tau : tensor_like of float
    the component precisions
Notes
Pass in either sigma or tau, but not both; they are related by tau = 1 / sigma**2.
Examples
```python
n_components = 3

with pm.Model() as gauss_mix:
    μ = pm.Normal(
        "μ",
        mu=data.mean(),
        sigma=10,
        shape=n_components,
        transform=pm.distributions.transforms.ordered,
        initval=[1, 2, 3],
    )
    σ = pm.HalfNormal("σ", sigma=10, shape=n_components)
    weights = pm.Dirichlet("w", np.ones(n_components))

    y = pm.NormalMixture("y", w=weights, mu=μ, sigma=σ, observed=data)
```
Methods
NormalMixture.dist(w, mu[, sigma, tau])