pymc.ASVGD.fit

ASVGD.fit(n=10000, score=None, callbacks=None, progressbar=True, progressbar_theme=<rich.theme.Theme object>, obj_n_mc=500, **kwargs)[source]

Perform Operator Variational Inference.

Parameters:
n : int

Number of iterations

score : bool

Whether to evaluate the loss on each iteration

callbacks : list[function: (Approximation, losses, i) -> None]

Functions called after each iteration step (see the illustrative callback sketch after this parameter list)

progressbar : bool

Whether to show the progress bar

progressbar_theme : Theme

Custom theme for the progress bar
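
The callbacks entry above accepts any plain Python callable with the signature (Approximation, losses, i) -> None. Below is a minimal sketch of a custom callback; the class name, the snapshot interval, and the idea of tracking parameter history are illustrative assumptions, not part of the PyMC API. Note that losses may be None when the loss is not being scored, so a callback should not rely on it unconditionally.

    import numpy as np

    class ParamTracker:
        """Illustrative callback matching the (approx, losses, i) -> None signature.

        Records a snapshot of the approximation's parameters every `every`
        iterations so convergence can be inspected after the fit.
        """

        def __init__(self, every=1000):
            self.every = every
            self.history = []

        def __call__(self, approx, losses, i):
            # approx.params is a list of shared variables holding the
            # approximation's trainable parameters; losses is ignored here
            # because it may be None when the loss is not being scored.
            if i % self.every == 0:
                self.history.append([p.get_value().copy() for p in approx.params])

Built-in callbacks such as CheckParametersConvergence follow the same interface and stop the fit early by raising StopIteration.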

Returns:
Approximation
Other Parameters:
obj_n_mc: `int`

Number of Monte Carlo samples used for approximation of objective gradients

tf_n_mc: `int`

Number of Monte Carlo samples used for approximation of test function gradients

obj_optimizer: function (grads, params) -> updates

Optimizer that is used for objective params

test_optimizer: function (grads, params) -> updates

Optimizer that is used for test function params

more_obj_params: `list`

Add custom params for objective optimizer

more_tf_params: `list`

Add custom params for test function optimizer

more_updates: `dict`

Add custom updates to resulting updates

total_grad_norm_constraint: `float`

Bounds the gradient norm to prevent the exploding gradient problem

fn_kwargs: `dict`

Add kwargs to pytensor.function (e.g. {'profile': True})

more_replacements: `dict`

Apply custom replacements before calculating gradients
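
A minimal end-to-end usage sketch follows. The toy Gaussian model, the data, and the choice of callback are illustrative assumptions, not taken from these docs; only the fit arguments mirror the signature above.

    import numpy as np
    import pymc as pm
    from pymc.variational.callbacks import CheckParametersConvergence

    # Illustrative data: 100 draws from a normal distribution.
    observed = np.random.default_rng(0).normal(loc=1.0, scale=2.0, size=100)

    with pm.Model():
        mu = pm.Normal("mu", 0.0, 10.0)
        sigma = pm.HalfNormal("sigma", 5.0)
        pm.Normal("obs", mu=mu, sigma=sigma, observed=observed)

        # Fit with ASVGD; obj_n_mc sets the number of Monte Carlo samples
        # used for the objective gradient estimates (see Other Parameters).
        approx = pm.ASVGD().fit(
            n=10000,
            callbacks=[CheckParametersConvergence(diff="absolute")],
            obj_n_mc=500,
        )

    # The returned Approximation can be sampled like a fitted posterior.
    idata = approx.sample(1000)

The object assigned to approx is the Approximation documented under Returns above; sampling from it produces draws from the fitted approximation for downstream analysis.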