Forum board: Econometrics and Statistics › Econometrics and Statistical Software
2010-05-14
Finite Mixture and Markov Switching Models (Springer Series in Statistics) (Hardcover)
Sylvia Frühwirth-Schnatter (Author)


Description from buecher.de
The prominence of finite mixture modelling is greater than ever. Many important statistical topics like clustering data, outlier treatment, or dealing with unobserved heterogeneity involve finite mixture models in some way or other. The area of potential applications goes beyond simple data analysis and extends to regression analysis and to non-linear time series analysis using Markov switching models.
In the more than one hundred years since Karl Pearson showed in 1894 how to estimate the five parameters of a mixture of two normal distributions using the method of moments, statistical inference for finite mixture models has been a challenge to everybody who deals with them. In the past ten years, very powerful computational tools have emerged for dealing with these models, combining a Bayesian approach with recent Monte Carlo simulation techniques based on Markov chains. This book reviews these techniques and covers the most recent advances in the field, among them bridge sampling techniques and reversible jump Markov chain Monte Carlo methods.
This is the first time the Bayesian perspective on finite mixture modelling has been presented systematically in book form. The author argues that the Bayesian approach provides much insight in this context and is easily implemented in practice. Although the main focus is on Bayesian inference, several frequentist techniques are also reviewed, especially for selecting the number of components of a finite mixture model, and some of their shortcomings compared to the Bayesian approach are discussed.
The aim of this book is to impart the finite mixture and Markov switching approach to statistical modelling to a wide-ranging community, including not only statisticians but also biologists, economists, engineers, financial agents, market researchers, medical researchers, and other frequent users of statistical models. It should help newcomers to the field understand how finite mixture and Markov switching models are formulated, what structures they imply on the data, what they can be used for, and how they are estimated. Researchers familiar with the subject will also profit from reading it. The presentation is rather informal without abandoning mathematical correctness. Prior familiarity with Bayesian inference and Monte Carlo simulation is helpful but not required.
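The Bayesian computational approach described above can be illustrated with a small sketch. This is my own illustration, not the book's bayesf MATLAB code: a data-augmentation Gibbs sampler for a two-component normal mixture with known, equal variances and conjugate priors, alternating between the latent allocations, the mixture weights, and the component means. All prior settings and function names are illustrative assumptions.

```python
import numpy as np

def gibbs_two_normal_mixture(y, n_iter=2000, sigma=1.0, seed=0):
    """Gibbs sampler for a two-component normal mixture with known,
    equal variances. Illustrative priors: mu_k ~ N(0, 10^2),
    weights ~ Dirichlet(1, 1)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    mu = np.array([y.min(), y.max()])   # crude starting values
    eta = np.array([0.5, 0.5])          # mixture weights
    mu_draws = np.empty((n_iter, 2))
    for it in range(n_iter):
        # 1) sample allocations S_i given parameters (Bayes rule per obs.)
        dens = eta * np.exp(-0.5 * ((y[:, None] - mu) / sigma) ** 2)
        prob1 = dens[:, 1] / dens.sum(axis=1)
        S = (rng.random(n) < prob1).astype(int)
        # 2) sample weights given allocations: Dirichlet posterior
        n1 = S.sum()
        eta = rng.dirichlet([1 + n - n1, 1 + n1])
        # 3) sample component means given allocations: conjugate normal
        for k in (0, 1):
            yk = y[S == k]
            prec = len(yk) / sigma**2 + 1 / 100.0
            mu[k] = rng.normal((yk.sum() / sigma**2) / prec,
                               1 / np.sqrt(prec))
        mu_draws[it] = np.sort(mu)  # ordering constraint on the draws
    return mu_draws

# simulated data: mixture of N(-2, 1) and N(2, 1)
rng = np.random.default_rng(1)
y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(2, 1, 150)])
post = gibbs_two_normal_mixture(y)[500:]  # discard burn-in
print(post.mean(axis=0))  # posterior means, roughly (-2, 2)
```

With well-separated components this chain mixes quickly; the final sort imposes an ordering constraint so that the labelled draws are comparable across iterations (label switching is treated at length in the book's Chapter 3).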


Editorial Reviews
Review
From the reviews:
"At first glance, the numerous equations and formulas may seem daunting for psychologists with limited statistical background; however, the descriptions and explanations of the various models are actually quite reader friendly (more so than many advanced statistical textbooks). The author has done an excellent job of inviting newcomers to enter the world of mixture models; more impressively, the author did so without sacrificing mathematical and statistical rigor. Mixture models are appealing in many applications in social and psychological studies. This book not only offers a gentle introduction to mixture models but also provides more in-depth coverage for those who look beyond the surface. I believe that psychologists who are interested in related models (e.g., latent class models, latent Markov models, and latent class regression models) will benefit greatly from this book. I highly recommend this book to all psychologists who are interested in mixture models." (Hsiu-Ting Yu, Psychometrika, Vol. 74, No. 3, pp. 559-560, September 2009)
"The book is impressive in its mathematical and formal correctness, in generality and in detail. ... It would be helpful as an additional reference among a wider range of available textbooks in the area. [I]t will find many friends among experts and newcomers to the world of mixture models." (Atanu Biswas, Biometrics, Issue 63, September 2007)
"Finite mixture distributions are important for many models. Therefore they constitute a very active field of research. This book gives an up-to-date overview of the various models of this kind. … The aim of this book is to impart the finite mixture and Markov switching approach to statistical modeling to a wide-ranging community. … For the frequentists, it offers a good opportunity to explore the advantages of the Bayesian approach in the context of mixture models." (Gheorghe Pitis, Zentralblatt MATH, Vol. 1108 (10), 2007)
"Readership: Statisticians, biologists, economists, engineers, financial agents, market researchers, medical researchers or any other frequent user of statistical models. The first nine chapters of the book are concerned with static mixture models, and the last four with Markov switching models. … especially valuable for students, serving to demonstrate how different statistical techniques, which superficially appear to be unrelated, are in fact part of an integrated whole. This book struck me as being particularly clearly written – it is a pleasure to read." (David J. Hand, International Statistical Review, Vol. 75 (2), 2007)
"The book is excellent, giving a most readable overview of the topic of finite mixtures, aimed at a broad readership … . Students will like the text because of the pedagogical writing style; researchers will definitely welcome the broad treatment of the subject. Both will benefit from the extensive and up-to-date bibliography … as well as the well-organized index. No doubt, this book is a valuable addition to the field of statistics and will surely find its rightful place in many a statistician’s library." (Valerie Chavez-Demoulin, Journal of the American Statistical Association, Vol. 104 (485), March, 2009)


Product Description
WINNER OF THE 2007 DEGROOT PRIZE!





Product Details
  • Hardcover: 492 pages
  • Publisher: Springer; 1 edition (August 8, 2006)
  • Language: English
  • ISBN-10: 0387329099
  • ISBN-13: 978-0387329093




Contents

1 Finite Mixture Modeling 1
2 Statistical Inference for a Finite Mixture Model with Known Number of Components 25
3 Practical Bayesian Inference for a Finite Mixture Model with Known Number of Components 57
4 Statistical Inference for Finite Mixture Models Under Model Specification Uncertainty 99
5 Computational Tools for Bayesian Inference for Finite Mixture Models Under Model Specification Uncertainty 125
6 Finite Mixture Models with Normal Components 169
7 Data Analysis Based on Finite Mixtures 203
8 Finite Mixtures of Regression Models 241
9 Finite Mixture Models with Nonnormal Components 277
10 Finite Markov Mixture Modeling 301
11 Statistical Inference for Markov Switching Models 319
12 Nonlinear Time Series Analysis Based on Markov Switching Models 357
13 Switching State Space Models 389
Attachments
Cover.PNG

Original size: 493.71 KB

Finite Mixture and Markov Switching Models

book_matlab_version_2.0.pdf

Size: 907.92 KB

Code Book

bayesf_version_1.0.zip

Size: 181.28 KB

A manual describing the use of version 1.0 of this package is included.

This archive contains:

  • start.m
  • start_book.m
  • SIMUNI.M
  • warn.m
  • simuniform.m
  • matlab12.m
  • marstat.m
  • matlab15.m
  • Countex.m
  • matlab11.m
  • matlab14.m
  • matlab95.m
  • matlab21.m
  • matlab51.m
  • matlab63.m
  • matlab22.m
  • statecount.m
  • moment_mix_poisson.m
  • mcmcstore.m
  • Istation.m
  • plot_biv_normal.m
  • Dirichlog_eye.m
  • Dirichlog_mbclust.m
  • plotsub.m
  • ranwi_eye.m
  • fish_data.m
  • matlab43.m
  • matlab13.m
  • eye_dat.mat
  • Likeli_normal_old.m
  • logprior_mixpoi.m
  • boxplotvar_old.m
  • mixturecdweight.m
  • Plotac.m
  • Plotac_discrete.m
  • Plotconverge.m
  • Likeli_normal.m
  • QINCOL.M
  • matlab_univortrag.m
  • boxplotvar.m
  • compute_em_poi.m
  • matlab_WII.m
  • mixturemomentsnor.m
  • mychol.m
  • plotclass.m
  • mcmcregression.m
  • plotdichte.m
  • Histneu.m
  • Prodgamlog.m
  • qincolmult.m
  • matlab35.m
  • lamb_dat.mat
  • Likeli_poisson.m
  • star_clust_dat.m
  • Raninvwi_neu.m
  • Invwisim.m
  • logprior_mixpoi_hpmarg.m
  • likeli_normult.m
  • mcmc_mix_poi_mean.m
  • matlab64.m
  • make_contan_neu.m
  • matlabxx.m
  • Likeli_poisson_old.m
  • matlab115.m
  • plot_point_process.m
  • qinmatr.M
  • matlab65.m
  • Simstate.m
  • pmultnormlog.m
  • qinmatrmult.M
  • Simstate_student.m
  • mcmc_sim_sst.m
  • mixturemar.m
  • autocovemp.m
  • matlab64all.m
  • matlab_poireg.m
  • mixpoiprior_old.m
  • Pwilog.m
  • Dirichsim.m
  • mcmc_sim_eta.m
  • Pinvwilog_neu.m
  • matlab96.m
  • matlab65all.m
  • Autocov_alt.m
  • Dirichpdflog.m
  • multimix_ipprior.m
  • mcmcmargmom.m
  • mcmcpredsam.m
  • designpoints.m
  • multimix_cdprior.m
  • mcmcbfplot.m
  • mixturepdf.m
  • gdp_us_dat.mat
  • mcmcpredmom.m
  • multimix_cdpost.m
  • multimix_ippost.m
  • prioreval.m
  • mixpoiprior.m
  • mcmcsubseq_alt.m
  • star_clust_dat.mat
  • mcmcput.m
  • matlab91.m
  • moments_test.m
  • mixturecdpar.m
  • contmix.m
  • proddirichpdflog.m
  • autocovneu.m
  • Prodgamsim.m
  • normalpdflog.m
  • mcmcestimate.m
  • Prodgampdflog.m
  • Prodinvgampdflog.m
  • prodnorpdflog.m
  • mixturediag.m
  • plotpred.m
  • matlab93.m
  • matlab81.m
  • mlcf_multimix.m
  • matlab97.m
  • Plottheta.m
  • matlab92.m
  • mcmcic.m
  • matlab61_old.m
  • surfcontmix.m
  • matlab102.m
  • matlab82.m
  • matlab111_noperm.m
  • matlab94.m
  • matlab111_ergodic.m
  • mixpoissonbf.m
  • likelihoodeval.m
  • matlab111.m
  • matlab98.m
  • fullperm.m
  • prodnormultsim.m
  • mcmcclustplot.m
  • matlab83.m
  • normultsim.m
  • simulatestart.m
  • mc.m
  • mlip_multimix.m
  • compute_posteriormode.m
  • mcmcextract.m
  • mcmcaverage.m
  • multimix_start.m
  • mcmcclust.m
  • simstate_ms_old.m
  • matlab103.m
  • compute_prior.m
  • dataget.m
  • matlab61.m
  • simstate_ms.m
  • datamoments.m
  • matlab62.m
  • Iris_data.dat
  • designar.m
  • mixnorprior_old.m
  • mixtureplot.m
  • matlab116.m
  • matlab101.m
  • mcmcsubseq.m
  • posteriorlog.m
  • dataplot.m
  • checkprior.m
  • moments.m
  • simstate_msmult.m
  • polio.html
  • mcmcpreddens.m
  • matlab114.m
  • mixnorprior.m
  • matlab113.m
  • matlab112.m
  • Mlbsall.m
  • mcmc_nor_musig.m
  • dataclass.m
  • mcmcstart.m
  • mcmcpermute.m
  • mixturepoint.m
  • GNP.DAT
  • run_mixture.m
  • simulate.m
  • mcmcsamrep.m
  • posterior.m
  • mcmcdiag.m
  • mcmcplot.m
  • prodnormultpdflog.m
  • polio.mat
  • compute_mixture_old.m
  • prodinvwipdflog.m
  • compute_mixture.m
  • mixnorbf.m
  • mixnorbf_new.m
  • priordefine.m
  • mcmcbf.m
  • highlight.m
  • mixturemcmc.m
  • matlab84.m
  • stabdiagramm.m
  • plotclasscross.m

bayesf_version_2.0.zip

Size: 167.21 KB

A manual describing the use of version 2.0 of this package is included.

This archive contains:

  • mixturemomentsnor.m
  • iris_data.dat
  • pwilog.m
  • lamb_dat.mat
  • fish_data.m
  • eye_dat.mat
  • qincol.m
  • countex.m
  • simuniform.m
  • simuni.m
  • GNP.DAT
  • dataget.m
  • start_lamb.m
  • start_gdp.m
  • start_fabricfault_negbin.m
  • start_eye.m
  • mcmcplot.m
  • demo_mix_student_Kunknown.m
  • demo_msar_reg_mixeffects.m
  • demo_msar_reg.m
  • demo_msreg_mixeffects.m
  • demo_msreg.m
  • demo_regression_mix_binomial.m
  • demo_mix_binomial.m
  • demo_mix_student.m
  • demo_mix_normal.m
  • mcmcbf.m
  • mixturemcmc.m
  • mcmcstart.m
  • start_gdp_swi.m
  • demo_mixreg_mixeffects.m
  • demo_mixreg.m
  • start_fabricfault_mixed_effects.m
  • dataplot.m
  • start_iris.m
  • start_fishery.m
  • demo_mix_multivariate_student_Kunknown.m
  • demo_mix_multivariate_student.m
  • demo_mix_exponential.m
  • demo_mix_multivariate_normal_Kunknown.m
  • demo_mix_multivariate_normal.m
  • demo_mix_normal_Kunknown.m
  • start_fishery_K4.m
  • start_iris_K3.m
  • start_gdp_marmix.m
  • start_fabricfault.m
  • dataclass.m
  • demo_poisson_mix_reg_mixed_effects.m
  • simulate.m
  • priordefine.m
  • demo_regression_negbin.m
  • mcmcic.m
  • posterior.m
  • mcmcpermute.m
  • mixnorprior.m
  • mcmcestimate.m
  • demo_mix_normal_mu0_Kunknown.m
  • start_fishery_plot.m
  • demo_figure2_1.m
  • mcmcaverage.m
  • mcmcextract.m
  • mcmcclustplot.m
  • mcmcdiag.m
  • plotac_discrete.m
  • datamoments.m
  • demo_msm_multivariate_normal_Kunknown.m
  • plotac.m
  • mixstudprior.m
  • skewn_transform.m
  • likeli_skewstudmult.m
  • mcmcsamrep.m
  • demo_multivariate_skewnormal.m
  • ranpermute.m
  • mcmcput.m
  • mcmcsubseq.m
  • moments.m
  • mcmc_student_df.m
  • mcmc_negbin_df.m
  • stabdiagramm.m
  • prodstudmultpdflog.m
  • plotclasscross.m
  • likeli_multinomial.m
  • mcmcbfplot.m
  • demo_multinomial.m
  • mixglmprior.m
  • prioreval.m
  • mixturepdf.m
  • mixtureplot.m
  • mixturemar.m
  • momentest.m
  • start_eye_plot.m
  • autocov.m
  • mixturepoint.m
  • demo_figure2_2.m
  • mcmcclustsim.m
  • mcmcclust.m
  • mcmcpm.m
  • mc.m
  • invwisim.m
  • statecount.m
  • histneu.m
  • maketab.m
  • qinmatrmult.m
  • mixtureplot_biv.m
  • contmixskewstud.m
  • likeli_skewstudent.m
  • skewn_parameter.m
  • simulate_truncated_normal.m
  • eval_tcdf_skewt.m
  • likeli_skewstudmult_cd.m
  • demo_negbin.m
  • demo_binomial.m
  • prodnormultpdflog.m
  • prodstudmultsim.m
  • studmultsim.m
  • likeli_negbin.m
  • prodbetapdflog.m
  • likeli_binomial.m
  • likelihoodeval.m
  • prodbetasim.m
  • auxmix_initialize_binomial.m
  • auxmix_binomial.m
  • likeli_poisson.m
  • data_fabric_fault.m
  • auxmix_poisson.m
  • auxmix_initialize_poisson.m
  • likeli_skewnormal.m
  • likeli_stumult.m
  • likeli_student.m
  • contmixskewnormal.m
  • likeli_skewnormult.m
  • likeli_normal.m
  • plotclass.m
  • contmix.m
  • marginallikelihood_eval.m
  • contmixstud.m
  • likeli_expon.m
  • likeli_normult.m
  • mcmcpredmom.m
  • mixexpprior.m
  • bridgesampling_se.m
  • chib_se.m
  • mcmcpreddens.m
  • designar.m
  • demo_poisson_mix_reg.m
  • demo_msar.m
  • warn.m
  • mcmcstore.m
  • mcmcmargmom.m
  • compute_mixture.m
  • normultsim.m
  • simstate_ms.m
  • boxplotvar.m
  • proddirichpdflog.m
  • marstat.m
  • demo_poisson_reg_mixed_effects.m
  • demo_poisson_mix_reg_Kfix.m
  • prodnormultsim.m
  • normalpdflog.m
  • prodinvwipdflog.m
  • prodinvgampdflog.m
  • prodgamsim.m
  • dirichsim.m
  • prodnorpdflog.m
  • prodgampdflog.m
  • dirichpdflog.m
  • prodgamlog.m
  • qincolmult.m
  • logprior_mixpoi_hpmarg.m
  • qinmatr.m
  • plotsub.m

errata.pdf

Size: 59.79 KB

Errata

Finite Mixture and Markov Switching Models.pdf

Size: 5.63 MB

Requires 1 forum coin to download.

Book


Replies
2010-5-14 22:16:07
Thank you very much!

2010-5-15 01:05:42

Contents

1 Finite Mixture Modeling 1

1.1 Introduction 1

1.2 Finite Mixture Distributions 3

1.2.1 Basic Definitions 3

1.2.2 Some Descriptive Features of Finite Mixture Distributions 5

1.2.3 Diagnosing Similarity of Mixture Components 9

1.2.4 Moments of a Finite Mixture Distribution 10

1.2.5 Statistical Modeling Based on Finite Mixture Distributions 11

1.3 Identifiability of a Finite Mixture Distribution 14

1.3.1 Nonidentifiability Due to Invariance to Relabeling the Components 15

1.3.2 Nonidentifiability Due to Potential Overfitting 17

1.3.3 Formal Identifiability Constraints 19

1.3.4 Generic Identifiability 21

2 Statistical Inference for a Finite Mixture Model with
Known Number of Components 25

2.1 Introduction 25

2.2 Classification for Known Component Parameters 26

2.2.1 Bayes Rule for Classifying a Single Observation 26

2.2.2 The Bayes Classifier for a Whole Data Set 27

2.3 Parameter Estimation for Known Allocation 29

2.3.1 The Complete-Data Likelihood Function 29

2.3.2 Complete-Data Maximum Likelihood Estimation 30

2.3.3 Complete-Data Bayesian Estimation of the Component Parameters 31

2.3.4 Complete-Data Bayesian Estimation of the Weights 35

2.4 Parameter Estimation When the Allocations Are Unknown 41

2.4.1 Method of Moments 42

2.4.2 The Mixture Likelihood Function 43

2.4.3 A Helicopter Tour of the Mixture Likelihood Surface for Two Examples 44

2.4.4 Maximum Likelihood Estimation 49

2.4.5 Bayesian Parameter Estimation 53

2.4.6 Distance-Based Methods 54

2.4.7 Comparing Various Estimation Methods 54
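The Bayes rule for classifying a single observation (Section 2.2.1) amounts to normalizing the weighted component densities at that observation. A minimal sketch, with illustrative names and normal components assumed:

```python
import numpy as np

def classify(y, weights, means, sds):
    """Posterior probability that observation y belongs to each component
    of a normal mixture with known parameters (Bayes rule)."""
    weights, means, sds = map(np.asarray, (weights, means, sds))
    dens = weights * np.exp(-0.5 * ((y - means) / sds) ** 2) / sds
    return dens / dens.sum()

# mixture 0.5*N(-2,1) + 0.5*N(2,1): an observation at y=0 is fully
# ambiguous; one at y=2 almost surely comes from the second component
print(classify(0.0, [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0]))  # equal, 0.5 each
print(classify(2.0, [0.5, 0.5], [-2.0, 2.0], [1.0, 1.0]))
```

Applying this rule to every observation in a data set gives the Bayes classifier of Section 2.2.2.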


2010-5-23 10:24:24

3 Practical Bayesian Inference for a Finite Mixture Model
with Known Number of Components 57

3.1 Introduction 57

3.2 Choosing the Prior for the Parameters of a Mixture Model 58

3.2.1 Objective and Subjective Priors 58

3.2.2 Improper Priors May Cause Improper Mixture Posteriors 59

3.2.3 Conditionally Conjugate Priors 60

3.2.4 Hierarchical Priors and Partially Proper Priors 61

3.2.5 Other Priors 62

3.2.6 Invariant Prior Distributions 62

3.3 Some Properties of the Mixture Posterior Density 63

3.3.1 Invariance of the Posterior Distribution 63

3.3.2 Invariance of Seemingly Component-Specific Functionals 64

3.3.3 The Marginal Posterior Distribution of the Allocations 65

3.3.4 Invariance of the Posterior Distribution of the Allocations 67

3.4 Classification Without Parameter Estimation 68

3.4.1 Single-Move Gibbs Sampling 69

3.4.2 The Metropolis–Hastings Algorithm 72

3.5 Parameter Estimation Through Data Augmentation and MCMC 73

3.5.1 Treating Mixture Models as a Missing Data Problem 73

3.5.2 Data Augmentation and MCMC for a Mixture of Poisson Distributions 74

3.5.3 Data Augmentation and MCMC for General Mixtures 76

3.5.4 MCMC Sampling Under Improper Priors 78

3.5.5 Label Switching 78

3.5.6 Permutation MCMC Sampling 81

3.6 Other Monte Carlo Methods Useful for Mixture Models 83

3.6.1 A Metropolis–Hastings Algorithm for the Parameters 83

3.6.2 Importance Sampling for the Allocations 84

3.6.3 Perfect Sampling 85

3.7 Bayesian Inference for Finite Mixture Models Using Posterior Draws 85

3.7.1 Sampling Representations of the Mixture Posterior Density 85

3.7.2 Using Posterior Draws for Bayesian Inference 87

3.7.3 Predictive Density Estimation 89

3.7.4 Individual Parameter Inference 91

3.7.5 Inference on the Hyperparameter of a Hierarchical Prior 92

3.7.6 Inference on Component Parameters 92

3.7.7 Model Identification 94
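The label-switching problem of Section 3.5.5 and its effect on naive posterior summaries can be demonstrated on simulated draws. The sketch below is my own illustration, not the book's code: it builds fake posterior draws whose component labels permute at random, then identifies the model through the ordering constraint mu_1 < mu_2.

```python
import numpy as np

rng = np.random.default_rng(0)
# fake "posterior draws" for a two-component mixture whose labels switch:
# each draw is (mu_1, mu_2) near (-2, 2), stored under a random permutation
M = 5000
base = rng.normal([-2.0, 2.0], 0.1, size=(M, 2))
perm = rng.random(M) < 0.5
draws = base.copy()
draws[perm] = draws[perm][:, ::-1]

# naive component-wise means are meaningless under label switching
print(draws.mean(axis=0))          # both near 0

# identification through the ordering constraint mu_1 < mu_2
ident = np.sort(draws, axis=1)
print(ident.mean(axis=0))          # close to (-2, 2)
```

This is the simplest identification strategy; the book's permutation MCMC sampling (Section 3.5.6) deliberately exploits the invariance rather than suppressing it.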


2010-6-8 15:48:46

4 Statistical Inference for Finite Mixture Models Under
Model Specification Uncertainty 99

4.1 Introduction 99

4.2 Parameter Estimation Under Model Specification Uncertainty 100

4.2.1 Maximum Likelihood Estimation Under Model Specification Uncertainty 100

4.2.2 Practical Bayesian Parameter Estimation for Overfitting Finite Mixture Models 103

4.2.3 Potential Overfitting 105

4.3 Informal Methods for Identifying the Number of Components 107

4.3.1 Mode Hunting in the Mixture Posterior 108

4.3.2 Mode Hunting in the Sample Histogram 109

4.3.3 Diagnosing Mixtures Through the Method of Moments 110

4.3.4 Diagnosing Mixtures Through Predictive Methods 112

4.3.5 Further Approaches 114

4.4 Likelihood-Based Methods 114

4.4.1 The Likelihood Ratio Statistic 114

4.4.2 AIC, BIC, and the Schwarz Criterion 116

4.4.3 Further Approaches 117

4.5 Bayesian Inference Under Model Uncertainty 117

4.5.1 Trans-Dimensional Bayesian Inference 117

4.5.2 Marginal Likelihoods 118

4.5.3 Bayes Factors for Model Comparison 119

4.5.4 Formal Bayesian Model Selection 121

4.5.5 Choosing Priors for Model Selection 122

4.5.6 Further Approaches 123

5 Computational Tools for Bayesian Inference for Finite
Mixture Models Under Model Specification Uncertainty 125

5.1 Introduction 125

5.2 Trans-Dimensional Markov Chain Monte Carlo Methods 125

5.2.1 Product-Space MCMC 126

5.2.2 Reversible Jump MCMC 129

5.2.3 Birth and Death MCMC Methods 137

5.3 Marginal Likelihoods for Finite Mixture Models 139

5.3.1 Defining the Marginal Likelihood 139

5.3.2 Choosing Priors for Selecting the Number of Components 141

5.3.3 Computation of the Marginal Likelihood for Mixture Models 143

5.4 Simulation-Based Approximations of the Marginal Likelihood 143

5.4.1 Some Background on Monte Carlo Integration 143

5.4.2 Sampling-Based Approximations for Mixture Models 144

5.4.3 Importance Sampling 146

5.4.4 Reciprocal Importance Sampling 147

5.4.5 Harmonic Mean Estimator 148

5.4.6 Bridge Sampling Technique 150

5.4.7 Comparison of Different Simulation-Based Estimators 154

5.4.8 Dealing with Hierarchical Priors 159

5.5 Approximations to the Marginal Likelihood Based on Density Ratios 159

5.5.1 The Posterior Density Ratio 159

5.5.2 Chib’s Estimator 160

5.5.3 Laplace Approximation 164

5.6 Reversible Jump MCMC Versus Marginal Likelihoods? 165
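The simulation-based approximations of Section 5.4 are easiest to appreciate in a conjugate toy model where the marginal likelihood is available in closed form. The sketch below is an illustration of my own, not the book's code: it averages the likelihood over draws from the prior, the simplest sampling-based estimator, and compares the result with the exact value for the model y_i ~ N(mu, 1), mu ~ N(0, tau^2).

```python
import numpy as np

rng = np.random.default_rng(0)
n, tau = 20, 1.0
y = rng.normal(0.5, 1.0, n)

# analytic log marginal likelihood: integrating the normal likelihood
# against the N(0, tau^2) prior gives a closed form with A = n + 1/tau^2
A = n + 1.0 / tau**2
logml_exact = (-0.5 * n * np.log(2 * np.pi)
               - 0.5 * np.log(tau**2 * A)
               - 0.5 * np.sum(y**2)
               + np.sum(y)**2 / (2 * A))

# Monte Carlo approximation: average the likelihood over prior draws,
# computed stably on the log scale (log-sum-exp)
M = 200_000
mu = rng.normal(0.0, tau, M)
loglik = (-0.5 * n * np.log(2 * np.pi)
          - 0.5 * ((y[None, :] - mu[:, None]) ** 2).sum(axis=1))
m = loglik.max()
logml_mc = m + np.log(np.exp(loglik - m).mean())

print(logml_exact, logml_mc)  # the two values agree closely
```

For mixture models no such closed form exists, which is exactly why the chapter develops importance sampling, bridge sampling, and Chib's estimator; prior sampling as above becomes hopelessly inefficient once the posterior is far more concentrated than the prior.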


2010-6-8 15:49:03

6 Finite Mixture Models with Normal Components 169

6.1 Finite Mixtures of Normal Distributions 169

6.1.1 Model Formulation 169

6.1.2 Parameter Estimation for Mixtures of Normals 171

6.1.3 The Kiefer–Wolfowitz Example 174

6.1.4 Applications of Mixture of Normal Distributions 176

6.2 Bayesian Estimation of Univariate Mixtures of Normals 177

6.2.1 Bayesian Inference When the Allocations Are Known 177

6.2.2 Standard Prior Distributions 179

6.2.3 The Influence of the Prior on the Variance Ratio 179

6.2.4 Bayesian Estimation Using MCMC 180

6.2.5 MCMC Estimation Under Standard Improper Priors 182

6.2.6 Introducing Prior Dependence Among the Components 185

6.2.7 Further Sampling-Based Approaches 187

6.2.8 Application to the Fishery Data 188

6.3 Bayesian Estimation of Multivariate Mixtures of Normals 190

6.3.1 Bayesian Inference When the Allocations Are Known 190

6.3.2 Prior Distributions 192

6.3.3 Bayesian Parameter Estimation Using MCMC 193

6.3.4 Application to Fisher’s Iris Data 195

6.4 Further Issues 195

6.4.1 Parsimonious Finite Normal Mixtures 195

6.4.2 Model Selection Problems for Mixtures of Normals 199

7 Data Analysis Based on Finite Mixtures 203

7.1 Model-Based Clustering 203

7.1.1 Some Background on Cluster Analysis 203

7.1.2 Model-Based Clustering Using Finite Mixture Models 204

7.1.3 The Classification Likelihood and the Bayesian MAP Approach 207

7.1.4 Choosing Clustering Criteria and the Number of Components 210

7.1.5 Model Choice for the Fishery Data 216

7.1.6 Model Choice for Fisher’s Iris Data 218

7.1.7 Bayesian Clustering Based on Loss Functions 220

7.1.8 Clustering for Fisher’s Iris Data 224

7.2 Outlier Modeling 224

7.2.1 Outlier Modeling Using Finite Mixtures 224

7.2.2 Bayesian Inference for Outlier Models Based on Finite Mixtures 225

7.2.3 Outlier Modeling of Darwin’s Data 226

7.2.4 Clustering Under Outliers and Noise 227

7.3 Robust Finite Mixtures Based on the Student-t Distribution 230

7.3.1 Parameter Estimation 230

7.3.2 Dealing with Unknown Number of Components 233

7.4 Further Issues 233

7.4.1 Clustering High-Dimensional Data 233

7.4.2 Discriminant Analysis 235

7.4.3 Combining Classified and Unclassified Observations 236

7.4.4 Density Estimation Using Finite Mixtures 237

7.4.5 Finite Mixtures as an Auxiliary Computational Tool in Bayesian Analysis 238

