Contents
Preface vii
Contents xiii
List of Figures xix
List of Tables xxvii
1 Introduction 1
1.1 Outline 3
1.2 A note on programming 5
1.3 Symbols used throughout the book 6
2 Probability Theory and Classical Statistics 9
2.1 Rules of probability 9
2.2 Probability distributions in general 12
2.2.1 Important quantities in distributions 17
2.2.2 Multivariate distributions 19
2.2.3 Marginal and conditional distributions 23
2.3 Some important distributions in social science 25
2.3.1 The binomial distribution 25
2.3.2 The multinomial distribution 27
2.3.3 The Poisson distribution 28
2.3.4 The normal distribution 29
2.3.5 The multivariate normal distribution 30
2.3.6 t and multivariate t distributions 33
2.4 Classical statistics in social science 33
2.5 Maximum likelihood estimation 35
2.5.1 Constructing a likelihood function 36
2.5.2 Maximizing a likelihood function 38
2.5.3 Obtaining standard errors 39
2.5.4 A normal likelihood example 41
2.6 Conclusions 44
2.7 Exercises 44
2.7.1 Probability exercises 44
2.7.2 Classical inference exercises 45
3 Basics of Bayesian Statistics 47
3.1 Bayes’ Theorem for point probabilities 47
3.2 Bayes’ Theorem applied to probability distributions 50
3.2.1 Proportionality 51
3.3 Bayes’ Theorem with distributions: A voting example 53
3.3.1 Specification of a prior: The beta distribution 54
3.3.2 An alternative model for the polling data: A gamma prior/Poisson likelihood approach 60
3.4 A normal prior–normal likelihood example with σ2 known 62
3.4.1 Extending the normal distribution example 65
3.5 Some useful prior distributions 68
3.5.1 The Dirichlet distribution 69
3.5.2 The inverse gamma distribution 69
3.5.3 Wishart and inverse Wishart distributions 70
3.6 Criticism against Bayesian statistics 70
3.7 Conclusions 73
3.8 Exercises 74
4 Modern Model Estimation Part 1: Gibbs Sampling 77
4.1 What Bayesians want and why 77
4.2 The logic of sampling from posterior densities 78
4.3 Two basic sampling methods 80
4.3.1 The inversion method of sampling 81
4.3.2 The rejection method of sampling 84
4.4 Introduction to MCMC sampling 88
4.4.1 Generic Gibbs sampling 88
4.4.2 Gibbs sampling example using the inversion method 89
4.4.3 Example repeated using rejection sampling 93
4.4.4 Gibbs sampling from a real bivariate density 96
4.4.5 Reversing the process: Sampling the parameters given the data 100
4.5 Conclusions 103
4.6 Exercises 105
5 Modern Model Estimation Part 2: Metropolis–Hastings Sampling 107
5.1 A generic MH algorithm 108
5.1.1 Relationship between Gibbs and MH sampling 113
5.2 Example: MH sampling when conditional densities are difficult to derive 115
5.3 Example: MH sampling for a conditional density with an unknown form 118
5.4 Extending the bivariate normal example: The full multiparameter model 121
5.4.1 The conditionals for μx and μy 122
5.4.2 The conditionals for σ2x, σ2y, and ρ 123
5.4.3 The complete MH algorithm 124
5.4.4 A matrix approach to the bivariate normal distribution problem 126
5.5 Conclusions 128
5.6 Exercises 129