pymc.variational.operators.KSD

class pymc.variational.operators.KSD(approx, temperature=1)

Operator based on Kernelized Stein Discrepancy.

Input: A target distribution with density function \(p(x)\) and a set of initial particles \(\{x^0_i\}^n_{i=1}\).

Output: A set of particles \(\{x_i\}^n_{i=1}\) that approximates the target distribution:

\[\begin{split}x_i^{l+1} &\leftarrow x_i^l + \epsilon_l \hat{\phi}^{*}(x_i^l) \\ \hat{\phi}^{*}(x) &= \frac{1}{n}\sum^{n}_{j=1}\left[k(x^l_j, x)\, \nabla_{x^l_j} \log p(x^l_j)/\text{temp} + \nabla_{x^l_j} k(x^l_j, x)\right]\end{split}\]
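For intuition, here is a minimal NumPy sketch of the update above with an RBF kernel \(k(x, y) = \exp(-\lVert x - y\rVert^2 / h)\). This is an illustration only, not PyMC's internal implementation; the kernel choice, bandwidth h, step size eps, and the helper name svgd_step are assumptions.

```python
import numpy as np

def svgd_step(X, grad_logp, h=1.0, eps=0.1, temp=1.0):
    """One SVGD particle update following the display equation above.

    X         : (n, d) array of particles x^l
    grad_logp : callable, (n, d) -> (n, d), the score grad_x log p(x)
    h, eps    : assumed RBF bandwidth and step size (hypothetical choices)
    """
    n = X.shape[0]
    diff = X[:, None, :] - X[None, :, :]          # x_i - x_j, shape (n, n, d)
    K = np.exp(-np.sum(diff**2, axis=-1) / h)     # k(x_j, x_i), shape (n, n)
    score = grad_logp(X)                          # grad_{x_j} log p(x_j)
    # driving term: (1/n) sum_j k(x_j, x_i) grad log p(x_j) / temp
    drive = K @ score / (n * temp)
    # repulsive term: (1/n) sum_j grad_{x_j} k(x_j, x_i)
    #               = (2 / (n h)) sum_j (x_i - x_j) k(x_j, x_i)
    repulse = (2.0 / (n * h)) * np.sum(diff * K[:, :, None], axis=1)
    return X + eps * (drive + repulse)

# Toy run: particles drift toward a standard normal target,
# whose score is grad log p(x) = -x.
rng = np.random.default_rng(0)
X = rng.normal(3.0, 0.5, size=(50, 1))
for _ in range(200):
    X = svgd_step(X, lambda X: -X)
```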
Parameters:

approx : Approximation

Approximation used for inference

temperature : float

Temperature for the Stein gradient; it divides the \(\nabla \log p\) term in the update above
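KSD is typically not constructed by hand: it is the operator behind pymc.SVGD, which pm.fit(method="svgd") drives. A minimal usage sketch on a toy model (the model and data below are illustrative assumptions, not part of this API):

```python
import pymc as pm

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.0)
    pm.Normal("obs", mu, 1.0, observed=[0.2, -0.1, 0.4])

    # SVGD builds its objective from the KSD operator internally
    approx = pm.fit(n=1000, method="svgd")

trace = approx.sample(500)  # draw from the fitted particle approximation
```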

References

  • Qiang Liu, Dilin Wang (2016). Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm. arXiv:1608.04471

Methods

KSD.__init__(approx[, temperature])

KSD.apply(f)

Operator itself.

Attributes

T

datalogp

datalogp_norm

has_test_function

inputs

logp

logp_norm

logq

logq_norm

model

require_logq

returns_loss

supports_aevb

varlogp

varlogp_norm