· Hardcover: 344 pages
· Publisher: Chapman & Hall; 1st edition (August 26, 2009)
· Language: English
· ISBN-10: 142007749X
· ISBN-13: 978-1420077490
Preface xv
1 Introduction 1
1.1 Background . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 Scope, Aim and Outline . . . . . . . . . . . . . . . . . 6
1.3 Inverse Bayes Formulae (IBF) . . . . . . . . . . . . . . 9
1.3.1 The point-wise, function-wise and sampling IBF 10
1.3.2 Monte Carlo versions of the IBF . . . . . . . . 12
1.3.3 Generalization to the case of three vectors . . . 14
1.4 The Bayesian Methodology . . . . . . . . . . . . . . . 15
1.4.1 The posterior distribution . . . . . . . . . . . . 15
1.4.2 Nuisance parameters . . . . . . . . . . . . . . . 17
1.4.3 Posterior predictive distribution . . . . . . . . 18
1.4.4 Bayes factor . . . . . . . . . . . . . . . . . . . . 20
1.4.5 Marginal likelihood . . . . . . . . . . . . . . . . 21
1.5 The Missing Data Problems . . . . . . . . . . . . . . . 22
1.5.1 Missing data mechanism . . . . . . . . . . . . . 23
1.5.2 Data augmentation (DA) . . . . . . . . . . . . 23
1.5.3 The original DA algorithm . . . . . . . . . . . 24
1.5.4 Connection with the Gibbs sampler . . . . . . 26
1.5.5 Connection with the IBF . . . . . . . . . . . . 28
1.6 Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . 29
1.6.1 Shannon entropy . . . . . . . . . . . . . . . . . 29
1.6.2 Kullback–Leibler divergence . . . . . . . . . . . 30
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 31
2 Optimization, Monte Carlo Simulation and
Numerical Integration 35
2.1 Optimization . . . . . . . . . . . . . . . . . . . . . . . 36
2.1.1 The Newton–Raphson (NR) algorithm . . . . . 36
2.1.2 The expectation-maximization (EM) algorithm 40
2.1.3 The ECM algorithm . . . . . . . . . . . . . . . 47
2.1.4 Minorization-maximization (MM) algorithms . 49
2.2 Monte Carlo Simulation . . . . . . . . . . . . . . . . . 56
2.2.1 The inversion method . . . . . . . . . . . . . . 56
2.2.2 The rejection method . . . . . . . . . . . . . . 58
2.2.3 The sampling/importance resampling method . 62
2.2.4 The stochastic representation method . . . . . 66
2.2.5 The conditional sampling method . . . . . . . . 70
2.2.6 The vertical density representation method . . 72
2.3 Numerical Integration . . . . . . . . . . . . . . . . . . 75
2.3.1 Laplace approximations . . . . . . . . . . . . . 75
2.3.2 Riemannian simulation . . . . . . . . . . . . . . 77
2.3.3 The importance sampling method . . . . . . . 80
2.3.4 The cross-entropy method . . . . . . . . . . . . 84
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 89
3 Exact Solutions 93
3.1 Sample Surveys with Nonresponse . . . . . . . . . . . 93
3.2 Misclassified Multinomial Model . . . . . . . . . . . . 95
3.3 Genetic Linkage Model . . . . . . . . . . . . . . . . . . 97
3.4 Weibull Process with Missing Data . . . . . . . . . . . 99
3.5 Prediction Problem with Missing Data . . . . . . . . . 101
3.6 Binormal Model with Missing Data . . . . . . . . . . . 103
3.7 The 2 × 2 Crossover Trial with Missing Data . . . . . 105
3.8 Hierarchical Models . . . . . . . . . . . . . . . . . . . 108
3.9 Nonproduct Measurable Space (NPMS) . . . . . . . . 109
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 112
4 Discrete Missing Data Problems 117
4.1 The Exact IBF Sampling . . . . . . . . . . . . . . . . 118
4.2 Genetic Linkage Model . . . . . . . . . . . . . . . . . . 119
4.3 Contingency Tables with One Supplemental Margin . 121
4.4 Contingency Tables with Two Supplemental Margins . 123
4.4.1 Neurological complication data . . . . . . . . . 123
4.4.2 MLEs via the EM algorithm . . . . . . . . . . 123
4.4.3 Generation of i.i.d. posterior samples . . . . . . 125
4.5 The Hidden Sensitivity (HS) Model for Surveys with
Two Sensitive Questions . . . . . . . . . . . . . . . . . 126
4.5.1 Randomized response models . . . . . . . . . . 126
4.5.2 Nonrandomized response models . . . . . . . . 127
4.5.3 The nonrandomized hidden sensitivity model . 128
4.6 Zero-Inflated Poisson Model . . . . . . . . . . . . . . 132
4.7 Changepoint Problems . . . . . . . . . . . . . . . . . . 133
4.7.1 Bayesian formulation . . . . . . . . . . . . . . . 134
4.7.2 Binomial changepoint models . . . . . . . . . . 137
4.7.3 Poisson changepoint models . . . . . . . . . . . 139
4.8 Capture-Recapture Model . . . . . . . . . . . . . . . . 145
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 148
5 Computing Posteriors in the EM-Type Structures 155
5.1 The IBF Method . . . . . . . . . . . . . . . . . . . . . 156
5.1.1 The IBF sampling in the EM structure . . . . 156
5.1.2 The IBF sampling in the ECM structure . . . . 163
5.1.3 The IBF sampling in the MCEM structure . . 164
5.2 Incomplete Pre-Post Test Problems . . . . . . . . . . . 165
5.2.1 Motivating example: Sickle cell disease study . 166
5.2.2 Binormal model with missing data and known
variance . . . . . . . . . . . . . . . . . . . . . . 167
5.2.3 Binormal model with missing data and
unknown mean and variance . . . . . . . . . . 168
5.3 Right Censored Regression Model . . . . . . . . . . . . 173
5.4 Linear Mixed Models for Longitudinal Data . . . . . . 176
5.5 Probit Regression Models for Independent
Binary Data . . . . . . . . . . . . . . . . . . . . . . . . 181
5.6 A Probit-Normal GLMM for Repeated Binary Data . 185
5.6.1 Model formulation . . . . . . . . . . . . . . . . 186
5.6.2 An MCEM algorithm without using
the Gibbs sampler at E-step . . . . . . . . . . . 187
5.7 Hierarchical Models for Correlated Binary Data . . . . 195
5.8 Hybrid Algorithms: Combining the IBF Sampler
with the Gibbs Sampler . . . . . . . . . . . . . . . . . 197
5.8.1 Nonlinear regression models . . . . . . . . . . . 198
5.8.2 Binary regression models with t link . . . . . . 199
5.9 Assessing Convergence of MCMC Methods . . . . . . 201
5.9.1 Gelman and Rubin's PSR statistic . . . . . . . 202
5.9.2 The difference and ratio criteria . . . . . . . . . 203
5.9.3 The Kullback–Leibler divergence criterion . . . 204
5.10 Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . 204
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 206
6 Constrained Parameter Problems 211
6.1 Linear Inequality Constraints . . . . . . . . . . . . . . 211
6.1.1 Motivating examples . . . . . . . . . . . . . . . 211
6.1.2 Linear transformation . . . . . . . . . . . . . . 212
6.2 Constrained Normal Models . . . . . . . . . . . . . . . 214
6.2.1 Estimation when variances are known . . . . . 214
6.2.2 Estimation when variances are unknown . . . . 219
6.2.3 Two examples . . . . . . . . . . . . . . . . . . 222
6.2.4 Discussion . . . . . . . . . . . . . . . . . . . . . 227
6.3 Constrained Poisson Models . . . . . . . . . . . . . . . 228
6.3.1 Simplex restrictions on Poisson rates . . . . . . 228
6.3.2 Data augmentation . . . . . . . . . . . . . . . . 228
6.3.3 MLE via the EM algorithm . . . . . . . . . . . 229
6.3.4 Bayes estimation via the DA algorithm . . . . 230
6.3.5 Life insurance data analysis . . . . . . . . . . . 231
6.4 Constrained Binomial Models . . . . . . . . . . . . . . 233
6.4.1 Statistical model . . . . . . . . . . . . . . . . . 233
6.4.2 A physical particle model . . . . . . . . . . . . 234
6.4.3 MLE via the EM algorithm . . . . . . . . . . . 236
6.4.4 Bayes estimation via the DA algorithm . . . . 239
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 240
7 Checking Compatibility and Uniqueness 241
7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . 241
7.2 Two Continuous Conditional Distributions:
Product Measurable Space (PMS) . . . . . . . . . . . 243
7.2.1 Several basic notions . . . . . . . . . . . . . . . 243
7.2.2 A review on existing methods . . . . . . . . . . 244
7.2.3 Two examples . . . . . . . . . . . . . . . . . . 246
7.3 Finite Discrete Conditional Distributions: PMS . . . . 247
7.3.1 The formulation of the problems . . . . . . . . 248
7.3.2 The connection with quadratic optimization
under box constraints . . . . . . . . . . . . . . 248
7.3.3 Numerical examples . . . . . . . . . . . . . . . 250
7.3.4 Extension to more than two dimensions . . . . 253
7.3.5 The compatibility of regression function and
conditional distribution . . . . . . . . . . . . . 255
7.3.6 Appendix: S-plus function (lseb) . . . . . . . . 258
7.3.7 Discussion . . . . . . . . . . . . . . . . . . . . . 258
7.4 Two Conditional Distributions: NPMS . . . . . . . . . 259
7.5 One Marginal and Another Conditional Distribution . 262
7.5.1 A sufficient condition for uniqueness . . . . . . 262
7.5.2 The continuous case . . . . . . . . . . . . . . . 265
7.5.3 The finite discrete case . . . . . . . . . . . . . . 266
7.5.4 The connection with quadratic optimization
under box constraints . . . . . . . . . . . . . . 269
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 271
A Basic Statistical Distributions and Stochastic
Processes 273
A.1 Discrete Distributions . . . . . . . . . . . . . . . . . . 273
A.2 Continuous Distributions . . . . . . . . . . . . . . . . 275
A.3 Mixture Distributions . . . . . . . . . . . . . . . . . . 283
A.4 Stochastic Processes . . . . . . . . . . . . . . . . . . . 285
List of Figures 287
List of Tables 290
List of Acronyms 292
List of Symbols 294
References 298