pymc.adagrad_window
- pymc.adagrad_window(loss_or_grads=None, params=None, learning_rate=0.001, epsilon=0.1, n_win=10)
Returns a function that computes parameter updates. Unlike standard Adagrad, which normalizes by the accumulated sum of all past squared gradients, this variant uses a running window of the last n_win gradient steps; see the usage sketches below.
- Parameters:
- loss_or_grads: symbolic expression or list of expressions
A scalar loss expression, or a list of gradient expressions
- params: list of shared variables
The variables to generate update expressions for
- learning_rate: float
Learning rate.
- epsilon: float
Small offset added to the Adagrad normalizer to avoid division by zero.
- n_win: int
Number of past steps over which the gradient scales are computed.
- Returns:
OrderedDict
A dictionary mapping each parameter to its update expression.
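
A minimal sketch of calling the function directly with a loss and a parameter list to obtain the updates dictionary. The shared variable w, the quadratic loss, and the use of PyTensor as the compilation backend are illustrative assumptions, not part of this reference:

```python
import numpy as np
import pytensor
import pytensor.tensor as pt
import pymc as pm

# Hypothetical setup: a shared parameter vector and a simple quadratic loss.
w = pytensor.shared(np.zeros(3), name="w")
loss = pt.sum((w - 1.0) ** 2)

# With both loss_or_grads and params given, the updates dictionary is
# returned directly and can be compiled into a training step.
updates = pm.adagrad_window(loss_or_grads=loss, params=[w], learning_rate=0.1, n_win=10)
step = pytensor.function([], loss, updates=updates)

for _ in range(200):
    step()
print(w.get_value())  # should approach [1., 1., 1.]
```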
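
When called without loss_or_grads and params, the optimizer is returned partially applied, which is how it is typically handed to variational inference. A sketch, assuming the standard pm.fit API; the toy model and dataset are invented for illustration:

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(42)
data = rng.normal(loc=1.0, scale=2.0, size=500)  # toy dataset (assumption)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 10.0)
    pm.Normal("obs", mu=mu, sigma=2.0, observed=data)

    # adagrad_window(learning_rate=...) returns a callable here; pm.fit
    # later supplies the ADVI objective and its parameters to it.
    approx = pm.fit(n=10_000, obj_optimizer=pm.adagrad_window(learning_rate=0.01))
```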