Chapter 57
MARKOV CHAIN MONTE CARLO METHODS: COMPUTATION
AND INFERENCE
SIDDHARTHA CHIB*
John M. Olin School of Business, Washington University, Campus Box 1133, 1 Brookings Dr.,
St. Louis, MO 63130, USA
Contents
Abstract 3570
Keywords 3570
1 Introduction 3571
1.1 Organization 3573
2 Classical sampling methods 3573
2.1 Inverse transform method 3573
2.2 Accept-reject algorithm 3575
2.3 Method of composition 3576
3 Markov chains 3576
3.1 Definitions and results 3577
3.2 Computation of numerical accuracy and inefficiency factor 3579
4 Metropolis-Hastings algorithm 3580
4.1 The algorithm 3581
4.2 Convergence results 3584
4.3 Example 3585
4.4 Multiple-block M-H algorithm 3587
5 The Gibbs sampling algorithm 3589
5.1 The algorithm 3590
5.2 Connection with the multiple-block M-H algorithm 3591
5.3 Invariance of the Gibbs Markov chain 3592
5.4 Sufficient conditions for convergence 3592
5.5 Estimation of density ordinates 3592
5.6 Example: simulating a truncated multivariate normal 3594
6 Sampler performance and diagnostics 3595
7 Strategies for improving mixing 3596
7.1 Choice of blocking 3597
7.2 Tuning the proposal density 3597
7.3 Other strategies 3598
8 MCMC algorithms in Bayesian estimation 3599
8.1 Overview 3599
8.2 Notation and assumptions 3600
8.3 Normal and Student-t regression models 3602
8.4 Binary and ordinal probit 3604
8.5 Tobit censored regression 3607
8.6 Regression with change point 3608
8.7 Autoregressive time series 3610
8.8 Hidden Markov models 3612
8.9 State space models 3614
8.10 Stochastic volatility model 3616
8.11 Gaussian panel data models 3619
8.12 Multivariate binary data models 3620
9 Sampling the predictive density 3623
10 MCMC methods in model choice problems 3626
10.1 Background 3626
10.2 Marginal likelihood computation 3627
10.3 Model space-parameter space MCMC algorithms 3633
10.4 Variable selection 3636
10.5 Remark 3639
11 MCMC methods in optimization problems 3639
12 Concluding remarks 3641
References 3642