# Posts in explanation

## Bayes Factors and Marginal Likelihood

- 01 June 2022
- Category: beginner, explanation

The “Bayesian way” to compare models is to compute the *marginal likelihood* of each model \(p(y \mid M_k)\), *i.e.* the probability of the observed data \(y\) given model \(M_k\). The marginal likelihood is just the normalizing constant of Bayes’ theorem. We can see this if we write Bayes’ theorem and make explicit the fact that all inferences are model-dependent.
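Writing Bayes’ theorem with the model made explicit (a standard formulation, sketched here for reference), the marginal likelihood appears in the denominator:

```latex
p(\theta \mid y, M_k) = \frac{p(y \mid \theta, M_k)\, p(\theta \mid M_k)}{p(y \mid M_k)}
\qquad \text{where} \qquad
p(y \mid M_k) = \int p(y \mid \theta, M_k)\, p(\theta \mid M_k)\, d\theta
```

The ratio of two marginal likelihoods, \(BF_{01} = p(y \mid M_0) / p(y \mid M_1)\), is the Bayes factor comparing \(M_0\) against \(M_1\).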

## Approximate Bayesian Computation

- 31 May 2022
- Category: beginner, explanation

Approximate Bayesian Computation methods (also called likelihood-free inference methods) are a group of techniques developed for inferring posterior distributions in cases where the likelihood function is intractable or costly to evaluate. This does not mean that the likelihood function is not part of the analysis; it just means that we are approximating the likelihood, hence the name of the ABC methods.
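As a minimal sketch of the idea, here is rejection ABC for a toy problem: infer the mean of a normal distribution by drawing candidates from the prior, simulating data, and keeping only candidates whose simulated data lie close to the observed data. The model, summary statistic, and tolerance here are illustrative assumptions, not part of the original post:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy "observed" data: 100 draws from a normal with unknown mean (3.0 here).
observed = rng.normal(3.0, 1.0, size=100)

def simulate(mu, size=100):
    """Forward model: simulate data for a candidate parameter value."""
    return rng.normal(mu, 1.0, size=size)

def distance(a, b):
    """Distance between summary statistics (here, the sample means)."""
    return abs(a.mean() - b.mean())

# Rejection ABC: sample from the prior, accept candidates whose simulated
# data fall within a tolerance epsilon of the observed data.
epsilon = 0.1
posterior_samples = []
for _ in range(20_000):
    mu = rng.uniform(-10, 10)  # draw a candidate from a flat prior
    if distance(simulate(mu), observed) < epsilon:
        posterior_samples.append(mu)

posterior_samples = np.array(posterior_samples)
```

The accepted samples approximate the posterior of `mu`; shrinking `epsilon` tightens the approximation at the cost of accepting fewer samples.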

## Regression discontinuity design analysis

- 01 April 2022
- Category: beginner, explanation

Quasi-experiments involve experimental interventions and quantitative measures. However, quasi-experiments do *not* involve random assignment of units (e.g. cells, people, companies, schools, states) to test or control groups. This inability to conduct random assignment poses problems when making causal claims, as it makes it harder to argue that any difference between a control and test group is because of an intervention and not because of a confounding factor.

## Bayesian Additive Regression Trees: Introduction

- 21 December 2021
- Category: intermediate, explanation

Bayesian additive regression trees (BART) is a non-parametric regression approach. If we have some covariates \(X\) and we want to use them to model \(Y\), a BART model (omitting the priors) can be represented as:
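The excerpt ends before the formula; in the standard BART formulation, the model is a sum of \(m\) regression trees plus Gaussian noise (sketched here for reference):

```latex
Y = \sum_{j=1}^{m} g_j(X; T_j, M_j) + \epsilon, \qquad \epsilon \sim N(0, \sigma^2)
```

where each \(g_j\) is a regression tree with structure \(T_j\) and leaf values \(M_j\).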