Gibbs Sampling Explained: The Wisdom of Divide and Conquer

When direct sampling from a high-dimensional joint distribution is intractable, Gibbs sampling adopts a ‘divide and conquer’ strategy. By exploiting the full conditional distributions, it breaks one complex N-dimensional joint sampling problem into N simple one-dimensional sampling steps. This article explains the intuition, the mathematical justification (Brook’s Lemma), and a Python implementation. [Read More]
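The ‘divide and conquer’ idea in the summary can be sketched in a few lines of NumPy. The target here, a bivariate Gaussian with correlation `rho`, is chosen only because its two full conditionals are closed-form 1-D Gaussians; it is an illustrative example, not taken from the article:

```python
import numpy as np

def gibbs_bivariate_gaussian(rho, n_samples=5000, seed=0):
    """Gibbs sampler for a 2-D standard Gaussian with correlation rho.

    Instead of sampling (x, y) jointly, alternate two 1-D draws from
    the full conditionals: x | y ~ N(rho*y, 1 - rho^2) and vice versa.
    """
    rng = np.random.default_rng(seed)
    cond_std = np.sqrt(1 - rho**2)
    x, y = 0.0, 0.0
    samples = np.empty((n_samples, 2))
    for i in range(n_samples):
        x = rng.normal(rho * y, cond_std)  # 1-D draw: x given current y
        y = rng.normal(rho * x, cond_std)  # 1-D draw: y given new x
        samples[i] = (x, y)
    return samples

samples = gibbs_bivariate_gaussian(rho=0.8)
# After burn-in, the empirical correlation should be close to 0.8.
print(np.corrcoef(samples[1000:].T)[0, 1])
```

Note that no 2-D sampling routine is ever called: each step only needs a one-dimensional Gaussian draw, which is the whole point of the decomposition.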

The Metropolis-Hastings Algorithm: Breaking the Symmetry

The original Metropolis algorithm is restricted to symmetric proposals, which often ‘hit walls’ at boundaries or get lost in high dimensions. The MH algorithm introduces the ‘Hastings correction’, allowing asymmetric proposals (such as Langevin dynamics) while preserving detailed balance, which significantly improves sampling efficiency. [Read More]
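A minimal sketch of the Hastings correction with a Langevin-style (asymmetric) proposal, targeting a standard normal for concreteness; the function names, step size, and target are illustrative choices, not the article’s. Because the proposal density q(x′|x) is not symmetric, the acceptance ratio must include the q-ratio term:

```python
import numpy as np

def mala_step(x, log_p, grad_log_p, eps, rng):
    """One Metropolis-Hastings step with a Langevin proposal.

    Proposal: x' = x + eps * grad_log_p(x) + sqrt(2*eps) * noise.
    Since q(x'|x) != q(x|x'), the Hastings correction (the log_q terms
    below) is required to preserve detailed balance.
    """
    x_new = x + eps * grad_log_p(x) + np.sqrt(2 * eps) * rng.normal()

    def log_q(to, frm):  # log proposal density of moving frm -> to
        mu = frm + eps * grad_log_p(frm)
        return -((to - mu) ** 2) / (4 * eps)

    log_alpha = (log_p(x_new) - log_p(x)
                 + log_q(x, x_new) - log_q(x_new, x))  # Hastings correction
    return x_new if np.log(rng.uniform()) < log_alpha else x

# Target: standard normal, log p(x) = -x^2/2 (up to a constant).
rng = np.random.default_rng(0)
x, chain = 3.0, []
for _ in range(20000):
    x = mala_step(x, lambda t: -t**2 / 2, lambda t: -t, eps=0.5, rng=rng)
    chain.append(x)
chain = np.array(chain[2000:])
print(chain.mean(), chain.var())  # should be close to 0 and 1
```

Dropping the two `log_q` terms would silently bias the sampler toward regions the drift pushes into, which is exactly the failure mode the correction exists to prevent.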

Monte Carlo Sampling

Understand the core concepts of Monte Carlo methods: the Law of Large Numbers, rejection sampling, importance sampling, and variance reduction techniques (antithetic variates, control variates, stratified sampling). [Read More]
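One of the listed variance reduction techniques, antithetic variates, is easy to demonstrate on a toy integral; the integrand ∫₀¹ e⁻ˣ dx is an illustrative choice, not from the article. Pairing each uniform draw U with 1−U gives negatively correlated evaluations (since e⁻ˣ is monotone), so the paired average has lower variance than plain Monte Carlo:

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1 - np.exp(-1)  # the exact integral of e^{-x} over [0, 1]

# Plain Monte Carlo: average f(U) over uniform draws (Law of Large Numbers).
u = rng.uniform(size=100_000)
plain = np.exp(-u)

# Antithetic variates: pair U with 1-U. f is monotone, so the two
# evaluations are negatively correlated and their average varies less.
u2 = rng.uniform(size=50_000)
anti = (np.exp(-u2) + np.exp(-(1 - u2))) / 2

print(plain.mean(), anti.mean())  # both estimators are unbiased
print(plain.var(), anti.var())    # antithetic per-sample variance is smaller
```

Both estimators converge to the same value; the antithetic version simply gets there with fewer random draws for the same accuracy.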

Introduction to MCMC

We need MCMC because many distributions are known only in unnormalized form, which makes traditional sampling and integration methods ineffective. By constructing a Markov chain whose stationary distribution is the target, we obtain samples from the target: the long-term distribution of the chain’s trajectory ≈ target distribution. [Read More]
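The ‘unnormalized form is enough’ point above can be made concrete with a random-walk Metropolis chain; the Laplace target and step size here are illustrative assumptions. The acceptance rule only ever compares ratios of the density, so the normalizing constant cancels and is never computed:

```python
import numpy as np

def unnormalized_log_p(x):
    # Laplace density up to a constant: p(x) ∝ exp(-|x|).
    # The normalizing constant (1/2) is never needed: Metropolis
    # acceptance uses only the ratio p(x') / p(x).
    return -abs(x)

rng = np.random.default_rng(0)
x, chain = 0.0, []
for _ in range(50_000):
    proposal = x + rng.normal(scale=1.0)  # symmetric random-walk proposal
    if np.log(rng.uniform()) < unnormalized_log_p(proposal) - unnormalized_log_p(x):
        x = proposal  # accept; otherwise keep the current state
    chain.append(x)

chain = np.array(chain[5000:])  # discard burn-in
# Laplace(0, 1) has mean 0 and variance 2; the trajectory's long-run
# distribution should reproduce these moments.
print(chain.mean(), chain.var())
```

The chain’s empirical moments matching the target’s is exactly the “long-term distribution of the trajectory ≈ target distribution” claim in the summary.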