PHYS366: Statistical Methods in Astrophysics

Lesson 4: Inference in Practice: Sampling Techniques

Goals for this session:

  • Brainstorm, implement, and test methods of accelerating the convergence of Metropolis-based MCMC.
  • Explore the common and uncommon failure modes of MCMC.
  • Learn about the advanced techniques that address these problems (or don't!), and the available software packages implementing them.

About this session

By now, you know that the main practical challenge of doing Bayesian inference is obtaining samples from the posterior distribution. Today's class is all about the nuts and bolts of sampling from probability distributions that, in general, we don't know very much about ahead of time. We'll start out by playing around with a case where we know the right answer analytically, namely the posterior from fitting a linear model to data with Gaussian uncertainties.
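
To make that starting point concrete, here is a minimal sketch of a random-walk Metropolis sampler applied to such a linear model. The synthetic data, true parameter values, proposal width, and chain length below are all illustrative assumptions, not part of the course materials.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic data drawn from a known straight line y = m*x + b
# with Gaussian uncertainties sigma_y (all values are illustrative).
x = np.linspace(0.0, 10.0, 20)
m_true, b_true, sigma_y = 2.0, 1.0, 0.5
y = m_true * x + b_true + rng.normal(0.0, sigma_y, size=x.size)

def log_posterior(theta):
    """Log posterior for (m, b): flat priors, Gaussian likelihood."""
    m, b = theta
    return -0.5 * np.sum((y - (m * x + b)) ** 2 / sigma_y ** 2)

def metropolis(log_post, theta0, step, nsteps):
    """Random-walk Metropolis with an isotropic Gaussian proposal."""
    theta = np.asarray(theta0, dtype=float)
    logp = log_post(theta)
    chain = np.empty((nsteps, theta.size))
    n_accept = 0
    for i in range(nsteps):
        proposal = theta + rng.normal(0.0, step, size=theta.size)
        logp_prop = log_post(proposal)
        # Accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = proposal, logp_prop
            n_accept += 1
        chain[i] = theta
    return chain, n_accept / nsteps

chain, acc = metropolis(log_posterior, [0.0, 0.0], step=0.05, nsteps=10000)
print(f"acceptance fraction: {acc:.2f}")
print("posterior mean (m, b):", chain[2000:].mean(axis=0))  # discard burn-in
```

Because the priors are flat and the likelihood is Gaussian, the posterior mean recovered from the chain can be checked directly against the analytic least-squares solution.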

Accelerating Metropolis sampling

Discussion points:

  • What decisions did you have to make before running the sampler, and did they influence the convergence/efficiency?
  • What, if anything, did you do or change to improve these metrics?
  • Brainstorm: what else can we do to improve the acceptance ratio and convergence rate? (One common tactic is sketched after this list.)
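
One common answer to the brainstorm is to adapt the proposal scale during burn-in so that the acceptance fraction lands in a reasonable range (roughly 20–40% for random-walk Metropolis). The sketch below reuses the metropolis() and log_posterior() functions from the example above; the target range, rescaling factors, and pilot-chain lengths are assumptions.

```python
import numpy as np

# Reuses metropolis() and log_posterior() from the previous sketch.
def tune_step(log_post, theta0, step, n_pilot=500, n_rounds=10, target=(0.2, 0.4)):
    """Crude burn-in adaptation: rescale the proposal width until short
    pilot chains give an acceptance fraction inside `target`."""
    theta = np.asarray(theta0, dtype=float)
    for _ in range(n_rounds):
        pilot, acc = metropolis(log_post, theta, step, n_pilot)
        theta = pilot[-1]          # continue from where the pilot chain ended
        if acc < target[0]:
            step *= 0.7            # too few acceptances: shrink the steps
        elif acc > target[1]:
            step *= 1.4            # too many acceptances: grow the steps
        else:
            break                  # acceptance fraction is in range
    return step, theta

step, start = tune_step(log_posterior, [0.0, 0.0], step=1.0)
chain, acc = metropolis(log_posterior, start, step, nsteps=20000)
print(f"tuned step = {step:.3f}, final acceptance fraction = {acc:.2f}")
```

Note that the adaptation is confined to burn-in: changing the proposal scale during the production run would break detailed balance, so the final chain is drawn with the step size frozen.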

Difficult densities

Think-pair-share: what strategies can/should we employ when approaching a new posterior function that we know nothing about, and which may have nasty features like local maxima and narrow degeneracies?
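
One widely used strategy, sketched below, is to run several independent chains from widely dispersed starting points and compare them with the Gelman-Rubin statistic: if the chains have found different modes or have not mixed, the between-chain variance stays large and R-hat stays well above 1. The sketch again reuses metropolis() and log_posterior() from the first example; the number of chains, the spread of starting points, and the burn-in length are assumptions.

```python
import numpy as np

def gelman_rubin(chains):
    """Gelman-Rubin R-hat for chains of shape (n_chains, n_steps, n_params)."""
    chains = np.asarray(chains)
    n = chains.shape[1]
    chain_means = chains.mean(axis=1)            # per-chain parameter means
    W = chains.var(axis=1, ddof=1).mean(axis=0)  # within-chain variance
    B = n * chain_means.var(axis=0, ddof=1)      # between-chain variance
    var_hat = (n - 1) / n * W + B / n            # pooled variance estimate
    return np.sqrt(var_hat / W)

# Reuses metropolis() and log_posterior() from the first sketch.
starts = np.random.default_rng(7).uniform(-10.0, 10.0, size=(4, 2))
chains = [metropolis(log_posterior, s, step=0.05, nsteps=5000)[0][1000:]
          for s in starts]
print("R-hat per parameter:", gelman_rubin(chains))
```

Keep in mind that R-hat close to 1 is necessary but not sufficient: chains that all miss the same isolated mode will still agree with one another.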

More advanced sampling techniques
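
As one example of the software packages referred to in the session goals, the sketch below runs emcee's affine-invariant ensemble sampler on the same straight-line posterior from the first example. Treat it as an illustration of the interface rather than a recommendation; the walker count, chain length, and burn-in are assumptions.

```python
import numpy as np
import emcee  # affine-invariant ensemble sampler (Foreman-Mackey et al.)

# Illustrative only: the straight-line posterior from the first sketch,
# now sampled with an ensemble of walkers instead of a single chain.
ndim, nwalkers = 2, 32
p0 = np.random.default_rng(1).normal(0.0, 1.0, size=(nwalkers, ndim))

sampler = emcee.EnsembleSampler(nwalkers, ndim, log_posterior)
sampler.run_mcmc(p0, 5000, progress=True)

samples = sampler.get_chain(discard=1000, flat=True)  # drop burn-in, merge walkers
print("posterior mean (m, b):", samples.mean(axis=0))
print("mean acceptance fraction:", sampler.acceptance_fraction.mean())
```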