2010-06-13
The Bayesian Choice: From Decision-Theoretic Foundations to Computational Implementation (Springer Texts in Statistics) [Hardcover]
Christian Robert (Author)



Editorial Reviews
Review
From the reviews of the second edition:
SHORT BOOK REVIEWS
"The text reads fluently and beautifully throughout, with light, good-humoured touches that warm the reader without being intrusive. There are many examples and exercises, some of which draw out the essence of work of other authors. Each chapter ends with a "Notes" section containing further brief descriptions of research papers. A reference section lists about eight hundred and sixty references. Each chapter begins with a quotation from "The Wheel of Time", a sequence of books by Robert Jordan. Only a few displays and equations have numbers attached. This is an extremely fine, exceptional text of the highest quality."
ISI Short Book Reviews, April 2002
JOURNAL OF MATHEMATICAL PSYCHOLOGY
"This book is an excellent introduction to Bayesian statistics and decision making. The author does an outstanding job in explicating the Bayesian research program and in discussing how Bayesian statistics differs from fiducial inference and from the Neyman-Pearson likelihood approach … The book would be well suited for a graduate-level course in a mathematical statistics department. There are numerous examples and exercises to enhance a deeper understanding of the material. The writing is authoritative, comprehensive, and scholarly."
"This book is a publication in the well-known Springer Series in Statistics, published in 2001. It is a textbook that presents an introduction to Bayesian statistics and decision theory for a graduate-level course … . The textbook contains a wealth of references to the literature; therefore it can also be recommended as an important reference book for statistical researchers. … for those who want to make a Bayesian choice, I recommend that you make your choice by getting hold of Robert’s book, The Bayesian Choice." (Jan du Plessis, Newsletter of the South African Statistical Association, June, 2003)
"This is the second edition of the author’s graduate level textbook ‘The Bayesian choice: a decision-theoretic motivation.’ … The present book is a revised edition. It includes important advances that have taken place since then. Different from the previous edition is the decreased emphasis on decision-theoretic principles. Nevertheless, the connection between Bayesian Statistics and Decision Theory is developed. Moreover, the author emphasizes the increasing importance of computational techniques." (Krzysztof Piasecki, Zentralblatt MATH, Vol. 980, 2002)
Product Description
This graduate-level textbook presents an introduction to Bayesian statistics and decision theory. Its scope covers both the basic ideas of statistical theory and some of the more modern and advanced topics of Bayesian statistics, such as complete class theorems, the Stein effect, Bayesian model choice, hierarchical and empirical Bayes modeling, and Monte Carlo integration, including MCMC techniques and Gibbs sampling. The second edition includes a new chapter on model choice (Chapter 7), and the chapter on Bayesian calculations (Chapter 6) has been extensively revised. Chapter 4 includes a new section on dynamic models. In Chapter 3, the material on noninformative priors has been expanded, and Chapter 10 has been supplemented with more examples. The Bayesian Choice will be suitable as a text for courses on Bayesian analysis, decision theory, or a combination of them.
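Among the computational techniques the description mentions, the Gibbs sampler (covered in Section 6.3.3 of the book) works by alternately drawing each component from its full conditional distribution. A minimal illustrative sketch for a bivariate normal target with correlation rho, written for this post and not taken from the book:

```python
import random

def gibbs_bivariate_normal(rho=0.8, n_iter=10_000, seed=42):
    """Gibbs sampler for a bivariate normal with unit variances and
    correlation rho: each full conditional is N(rho * other, 1 - rho^2)."""
    rng = random.Random(seed)
    sd = (1.0 - rho**2) ** 0.5      # standard deviation of each full conditional
    x = y = 0.0                     # arbitrary starting point
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x | y
        y = rng.gauss(rho * x, sd)  # draw y | x
        samples.append((x, y))
    return samples

if __name__ == "__main__":
    samples = gibbs_bivariate_normal()
    mean_x = sum(x for x, _ in samples) / len(samples)
    print(f"sample mean of x: {mean_x:.3f}")  # should be near 0
```

With rho close to 1 the chain mixes slowly, so in practice one discards an initial burn-in and monitors convergence; these issues are among the topics Chapter 6 treats in detail.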




Product Details
  • Hardcover: 577 pages
  • Publisher: Springer; 2nd edition (May 25, 2001)
  • Language: English
  • ISBN-10: 0387952314
  • ISBN-13: 978-0387952314

Attachments

abbr_24d1569cdcf86c2bc8d239593a1e265b.pdf

Size: 6.29 MB

Price: 1 forum coin. Download now.


All replies
2010-6-13 08:46:24

Contents

Preface to the Paperback Edition vii

Preface to the Second Edition ix

Preface to the First Edition xiii

List of Tables xxiii

List of Figures xxv

1 Introduction 1

1.1 Statistical problems and statistical models 1

1.2 The Bayesian paradigm as a duality principle 8

1.3 Likelihood Principle and Sufficiency Principle 13

1.3.1 Sufficiency 13

1.3.2 The Likelihood Principle 15

1.3.3 Derivation of the Likelihood Principle 18

1.3.4 Implementation of the Likelihood Principle 19

1.3.5 Maximum likelihood estimation 20

1.4 Prior and posterior distributions 22

1.5 Improper prior distributions 26

1.6 The Bayesian choice 31

1.7 Exercises 31

1.8 Notes 45

2 Decision-Theoretic Foundations 51

2.1 Evaluating estimators 51

2.2 Existence of a utility function 54

2.3 Utility and loss 60

2.4 Two optimalities: minimaxity and admissibility 65

2.4.1 Randomized estimators 65

2.4.2 Minimaxity 66

2.4.3 Existence of minimax rules and maximin strategy 69

2.4.4 Admissibility 74

2.5 Usual loss functions 77

2.5.1 The quadratic loss 77

2.5.2 The absolute error loss 79

2.5.3 The 0-1 loss 80

2.5.4 Intrinsic losses 81

2.6 Criticisms and alternatives 83

2.7 Exercises 85

2.8 Notes 96


2010-6-13 08:47:02

3 From Prior Information to Prior Distributions 105

3.1 The difficulty in selecting a prior distribution 105

3.2 Subjective determination and approximations 106

3.2.1 Existence 106

3.2.2 Approximations to the prior distribution 108

3.2.3 Maximum entropy priors 109

3.2.4 Parametric approximations 111

3.2.5 Other techniques 113

3.3 Conjugate priors 113

3.3.1 Introduction 113

3.3.2 Justifications 114

3.3.3 Exponential families 115

3.3.4 Conjugate distributions for exponential families 120

3.4 Criticisms and extensions 123

3.5 Noninformative prior distributions 127

3.5.1 Laplace’s prior 127

3.5.2 Invariant priors 128

3.5.3 The Jeffreys prior 129

3.5.4 Reference priors 133

3.5.5 Matching priors 137

3.5.6 Other approaches 140

3.6 Posterior validation and robustness 141

3.7 Exercises 144

3.8 Notes 158

4 Bayesian Point Estimation 165

4.1 Bayesian inference 165

4.1.1 Introduction 165

4.1.2 MAP estimator 166

4.1.3 Likelihood Principle 167

4.1.4 Restricted parameter space 168

4.1.5 Precision of the Bayes estimators 170

4.1.6 Prediction 171

4.1.7 Back to Decision Theory 173

4.2 Bayesian Decision Theory 173

4.2.1 Bayes estimators 173

4.2.2 Conjugate priors 175

4.2.3 Loss estimation 178

4.3 Sampling models 180

4.3.1 Laplace succession rule 180

4.3.2 The tramcar problem 181

4.3.3 Capture-recapture models 182

4.4 The particular case of the normal model 186

4.4.1 Introduction 186

4.4.2 Estimation of variance 187

4.4.3 Linear models and G-priors 190

4.5 Dynamic models 193

4.5.1 Introduction 193

4.5.2 The AR model 196

4.5.3 The MA model 198

4.5.4 The ARMA model 201

4.6 Exercises 201

4.7 Notes 216


2010-6-13 08:47:26

5 Tests and Confidence Regions 223

5.1 Introduction 223

5.2 A first approach to testing theory 224

5.2.1 Decision-theoretic testing 224

5.2.2 The Bayes factor 227

5.2.3 Modification of the prior 229

5.2.4 Point-null hypotheses 230

5.2.5 Improper priors 232

5.2.6 Pseudo-Bayes factors 236

5.3 Comparisons with the classical approach 242

5.3.1 UMP and UMPU tests 242

5.3.2 Least favorable prior distributions 245

5.3.3 Criticisms 247

5.3.4 The p-values 249

5.3.5 Least favorable Bayesian answers 250

5.3.6 The one-sided case 254

5.4 A second decision-theoretic approach 256

5.5 Confidence regions 259

5.5.1 Credible intervals 260

5.5.2 Classical confidence intervals 263

5.5.3 Decision-theoretic evaluation of confidence sets 264

5.6 Exercises 267

5.7 Notes 279

6 Bayesian Calculations 285

6.1 Implementation difficulties 285

6.2 Classical approximation methods 293

6.2.1 Numerical integration 293

6.2.2 Monte Carlo methods 294

6.2.3 Laplace analytic approximation 298

6.3 Markov chain Monte Carlo methods 301

6.3.1 MCMC in practice 302

6.3.2 Metropolis–Hastings algorithms 303

6.3.3 The Gibbs sampler 307

6.3.4 Rao–Blackwellization 309

6.3.5 The general Gibbs sampler 311

6.3.6 The slice sampler 315

6.3.7 The impact on Bayesian Statistics 317

6.4 An application to mixture estimation 318

6.5 Exercises 321

6.6 Notes 334

7 Model Choice 343

7.1 Introduction 343

7.1.1 Choice between models 344

7.1.2 Model choice: motives and uses 347

7.2 Standard framework 348

7.2.1 Prior modeling for model choice 348

7.2.2 Bayes factors 350

7.2.3 Schwartz’s criterion 352

7.2.4 Bayesian deviance 354

7.3 Monte Carlo and MCMC approximations 356

7.3.1 Importance sampling 356

7.3.2 Bridge sampling 358

7.3.3 MCMC methods 359

7.3.4 Reversible jump MCMC 363

7.4 Model averaging 366

7.5 Model projections 369

7.6 Goodness-of-fit 374

7.7 Exercises 377

7.8 Notes 386


2010-6-13 08:47:57

8 Admissibility and Complete Classes 391

8.1 Introduction 391

8.2 Admissibility of Bayes estimators 391

8.2.1 General characterizations 391

8.2.2 Boundary conditions 393

8.2.3 Inadmissible generalized Bayes estimators 395

8.2.4 Differential representations 396

8.2.5 Recurrence conditions 398

8.3 Necessary and sufficient admissibility conditions 400

8.3.1 Continuous risks 401

8.3.2 Blyth’s sufficient condition 402

8.3.3 Stein’s necessary and sufficient condition 407

8.3.4 Another limit theorem 407

8.4 Complete classes 409

8.5 Necessary admissibility conditions 412

8.6 Exercises 416

8.7 Notes 425

9 Invariance, Haar Measures, and Equivariant Estimators 427

9.1 Invariance principles 427

9.2 The particular case of location parameters 429

9.3 Invariant decision problems 431

9.4 Best equivariant noninformative distributions 436

9.5 The Hunt–Stein theorem 441

9.6 The role of invariance in Bayesian Statistics 445

9.7 Exercises 446

9.8 Notes 454

10 Hierarchical and Empirical Bayes Extensions 457

10.1 Incompletely Specified Priors 457

10.2 Hierarchical Bayes analysis 460

10.2.1 Hierarchical models 460

10.2.2 Justifications 462

10.2.3 Conditional decompositions 465

10.2.4 Computational issues 468

10.2.5 Hierarchical extensions for the normal model 470

10.3 Optimality of hierarchical Bayes estimators 474

10.4 The empirical Bayes alternative 478

10.4.1 Nonparametric empirical Bayes 479

10.4.2 Parametric empirical Bayes 481

10.5 Empirical Bayes justifications of the Stein effect 484

10.5.1 Point estimation 485

10.5.2 Variance evaluation 487

10.5.3 Confidence regions 488

10.5.4 Comments 490

10.6 Exercises 490

10.7 Notes 502


2010-6-13 08:48:30

11 A Defense of the Bayesian Choice 507

A Probability Distributions 519

A.1 Normal distribution, Np(θ,Σ) 519

A.2 Gamma distribution, G(α, β) 519

A.3 Beta distribution, Be(α, β) 519

A.4 Student’s t-distribution, Tp(ν, θ,Σ) 520

A.5 Fisher’s F-distribution, F(ν, _) 520

A.6 Inverse gamma distribution, IG(α, β) 520

A.7 Noncentral chi-squared distribution, χ2ν(λ) 520

A.8 Dirichlet distribution, Dk(α1, . . ., αk) 521

A.9 Pareto distribution, Pa(α, x0) 521

A.10 Binomial distribution, B(n, p) 521

A.11 Multinomial distribution, Mk(n; p1, . . . , pk) 521

A.12 Poisson distribution, P(λ) 521

A.13 Negative binomial distribution, Neg(n, p) 522

A.14 Hypergeometric distribution, Hyp(N; n; p) 522

B Usual Pseudo-random Generators 523

B.1 Normal distribution, N(0, 1) 523

B.2 Exponential distribution, Exp(λ) 523

B.3 Student’s t-distribution, T (ν, 0, 1) 524

B.4 Gamma distribution, G(α, 1) 524

B.5 Binomial distribution, B(n, p) 525

B.6 Poisson distribution, P(λ) 525

C Notations 527

C.1 Mathematical 527

C.2 Probabilistic 528

C.3 Distributional 528

C.4 Decisional 529

C.5 Statistical 529

C.6 Markov chains 530

