pymc.NUTS

class pymc.NUTS(*args, **kwargs)

A sampler for continuous variables based on Hamiltonian mechanics.

NUTS automatically tunes the step size and the number of steps per sample. A detailed description can be found in [1], “Algorithm 6: Efficient No-U-Turn Sampler with Dual Averaging”.
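A minimal usage sketch (the toy model below and its variable names are illustrative, not part of the API): pymc.sample selects NUTS automatically for continuous variables, but the step method can also be created explicitly, for example to raise target_accept when divergences occur.

    import pymc as pm

    with pm.Model() as model:
        # Toy model; "mu" and "y" are illustrative names.
        mu = pm.Normal("mu", mu=0.0, sigma=1.0)
        y = pm.Normal("y", mu=mu, sigma=1.0, observed=[0.1, -0.3, 0.2])

        # Explicitly create the NUTS step method with a higher acceptance target.
        step = pm.NUTS(target_accept=0.9)

        # Draw posterior samples with this step method; sampler statistics are
        # stored alongside the draws in the returned InferenceData object.
        idata = pm.sample(1000, tune=1000, step=step)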

NUTS provides a number of statistics that can be accessed with trace.get_sampler_stats:

  • mean_tree_accept: The mean acceptance probability for the tree that generated this sample. The mean of these values across all samples except the burn-in should be approximately equal to target_accept (the default is 0.8).

  • diverging: Whether the trajectory for this sample diverged. If there are any divergences after burn-in, this indicates that the results might not be reliable. Reparametrization can often help, but you can also try to increase target_accept to something like 0.9 or 0.95.

  • energy: The energy at the point in phase-space where the sample was accepted. This can be used to identify posteriors with problematically long tails. See the sketch after this list for an example.

  • energy_change: The difference in energy between the start and the end of the trajectory. For a perfect integrator this would always be zero.

  • max_energy_change: The maximum difference in energy along the whole trajectory.

  • depth: The depth of the tree that was used to generate this sample.

  • tree_size: The number of leaves of the sampling tree when the sample was accepted. This is usually a bit less than 2 ** depth. If the tree size is large, the sampler is using a lot of leapfrog steps to find the next sample. This can happen, for example, if there are strong correlations in the posterior, if the posterior has long tails, if there are regions of high curvature (“funnels”), or if the variance estimates in the mass matrix are inaccurate. Reparametrization of the model or estimating the posterior variances from past samples might help.

  • tune: This is True if step size adaptation was turned on when this sample was generated.

  • step_size: The step size used for this sample.

  • step_size_bar: The current best known step size. After the tuning samples, the step size is set to this value. It should converge during tuning.

  • model_logp: The model log-likelihood for this sample.

  • process_time_diff: The time it took to draw the sample, as measured by the Python standard library's time.process_time. This counts all CPU time, including that of worker processes in BLAS and OpenMP.

  • perf_counter_diff: The time it took to draw the sample, as measured by the Python standard library's time.perf_counter (wall time).

  • perf_counter_start: The value of time.perf_counter at the beginning of the computation of the draw.

  • index_in_trajectory: The position of the posterior draw in the trajectory; this is usually only interesting for debugging purposes. For example, -4 indicates that the draw was the result of the fourth leapfrog step in the negative (backward) direction.

  • largest_eigval and smallest_eigval: Experimental statistics for some mass matrix adaptation algorithms. These are nan if the corresponding algorithm is not used.
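As a non-authoritative sketch of how these statistics can be inspected (assuming the idata result from the sampling example above and that ArviZ is installed; with the default InferenceData return value, the sampler statistics are stored under idata.sample_stats):

    import arviz as az

    # Count divergent transitions after tuning; any divergences suggest the
    # results might not be reliable.
    n_divergent = int(idata.sample_stats["diverging"].sum())
    print(f"Number of divergent transitions: {n_divergent}")

    # The energy statistic can reveal posteriors with problematically long
    # tails; ArviZ provides a ready-made diagnostic plot for it.
    az.plot_energy(idata)

When sampling with return_inferencedata=False, the same values can be retrieved from the returned trace with trace.get_sampler_stats, as described above.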

References

[1] Hoffman, Matthew D., & Gelman, Andrew. (2011). The No-U-Turn Sampler: Adaptively Setting Path Lengths in Hamiltonian Monte Carlo.

Methods

NUTS.__init__([vars, max_treedepth, ...])
    Set up the No-U-Turn sampler.

NUTS.astep(q0)
    Perform a single HMC iteration.

NUTS.competence(var, has_grad)
    Check how appropriate this class is for sampling a random variable.

NUTS.reset([start])

NUTS.reset_tuning([start])

NUTS.step(point)
    Perform a single step of the sampler.

NUTS.stop_tuning()

Attributes

default_blocked

name

stats_dtypes
    A list containing at most one dictionary that maps stat names to dtypes.

stats_dtypes_shapes
    Maps stat names to dtypes and shapes.

vars
    Variables that the step method is assigned to.