# pymc.Simulator#

class pymc.Simulator(name, fn, *unnamed_params, params=None, distance='gaussian', sum_stat='identity', epsilon=1, ndim_supp=0, ndims_params=None, dtype='floatX', **kwargs)[source]#

Simulator distribution, used for Approximate Bayesian Computation (ABC) with Sequential Monte Carlo (SMC) sampling via pm.sample_smc.

Simulator distributions have a stochastic pseudo-loglikelihood defined by a distance metric between the observed and simulated data, scaled by the hyperparameter epsilon.

Parameters
fn: callable

Python random simulator function. Should expect the signature (rng, arg1, arg2, ..., argn, size), where rng is a numpy.random.Generator and size defines the size of the desired sample.

params: list

Parameters used by the Simulator random function. Parameters can also be passed positionally after fn, as in fn, param1, param2, ..., paramN, in which case the keyword argument params is ignored.

distance: Aesara Op, callable or str

Distance function. Available options are "gaussian" (default), "laplace", "kullback_leibler" or a user-defined function (or Aesara Op) that takes epsilon, the summary statistics of observed_data, and the summary statistics of simulated_data as input.

gaussian: $$-0.5 \left(\frac{x_o - x_s}{\epsilon}\right)^2$$

laplace: $$-\frac{|x_o - x_s|}{\epsilon}$$

kullback_leibler: $$\frac{d}{n} \frac{1}{\epsilon} \sum_i^n -\log \left( \frac{{\nu_d}_i}{{\rho_d}_i} \right) + \log_r$$ [1]

distance="gaussian" + sum_stat="sort" is equivalent to the 1D 2-Wasserstein distance

distance="laplace" + sum_stat="sort" is equivalent to the 1D 1-Wasserstein distance
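A user-defined distance can be sketched as a plain callable over epsilon and the two summary-statistic vectors. The function below is a minimal NumPy version of the default gaussian kernel shown above; the name `gaussian_distance` is illustrative, not part of the PyMC API, and the real default is implemented as an Aesara Op:

```python
import numpy as np

def gaussian_distance(epsilon, obs_stats, sim_stats):
    # Elementwise pseudo-loglikelihood terms: -0.5 * ((x_o - x_s) / epsilon)**2
    return -0.5 * ((obs_stats - sim_stats) / epsilon) ** 2
```

A callable with this signature could then be passed as `distance=gaussian_distance` in place of the string options.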

sum_stat: Aesara Op, callable or str

Summary statistic function. Available options are "identity" (default), "sort", "mean", "median". If a callable (or Aesara Op) is defined, it should return a 1d numpy array (or Aesara vector).
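As an illustrative sketch (the function name is hypothetical, not a PyMC builtin), a custom sum_stat could reduce the raw data to a short 1d vector of quantiles:

```python
import numpy as np

def quantile_stats(data):
    # Reduce raw data to a 1-D vector of summary quantiles
    return np.quantile(data, [0.1, 0.5, 0.9])
```

Such a callable could be passed as `sum_stat=quantile_stats`, provided it returns a 1d array as required above.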

epsilon: float or array

Scaling parameter for the distance functions. It should be a float or an array of the same size as the output of sum_stat. Defaults to 1.0.

ndim_supp: int

Number of dimensions of the SimulatorRV (0 for scalar, 1 for vector, etc.). Defaults to 0.

ndims_params: list[int]

Minimum number of dimensions of each parameter of the RV. For example, if the Simulator accepts two scalar inputs, it should be [0, 0]. Defaults to 0 for each parameter.
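For instance, a simulator drawing a vector-valued sample around a vector mean with a scalar scale would declare ndim_supp=1 and ndims_params=[1, 0]. A sketch of such a function, using only NumPy (the names are illustrative):

```python
import numpy as np

def mv_simulator_fn(rng, mu, sigma, size):
    # mu is a vector (ndims_params[0] == 1), sigma a scalar (ndims_params[1] == 0);
    # the output is vector-valued, matching ndim_supp=1
    return rng.normal(mu, sigma, size=size)

rng = np.random.default_rng(42)
draw = mv_simulator_fn(rng, np.zeros(3), 1.0, size=3)
```

In the corresponding pm.Simulator call one would then pass ndim_supp=1 and ndims_params=[1, 0] alongside fn.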

References

[1] Pérez-Cruz, F. (2008, July). Kullback-Leibler divergence estimation of continuous distributions. In 2008 IEEE International Symposium on Information Theory (pp. 1666-1670). IEEE.

Examples

```python
import pymc as pm

def simulator_fn(rng, loc, scale, size):
    return rng.normal(loc, scale, size=size)

with pm.Model() as m:
    loc = pm.Normal("loc", 0, 1)
    scale = pm.HalfNormal("scale", 1)
    # `data` holds the observed values
    simulator = pm.Simulator("simulator", simulator_fn, loc, scale, observed=data)
    idata = pm.sample_smc()
```


Methods

| Method | Description |
| --- | --- |
| `Simulator.__init__(*args, **kwargs)` | |
| `Simulator.dist(*params, **kwargs)` | Creates a tensor variable corresponding to the cls distribution. |
| `Simulator.logp(value, sim_op, sim_inputs)` | |
| `Simulator.moment(rv, *sim_inputs)` | |

Attributes

- `rv_class`
- `rv_op`