Chapter 9 Summary
Foundations
* Subjective Probability
* Coherence
* Decision Theory
* Complete Class Theorem
* Large Sample Theory
Beta–Binomial & Conjugate Normal Models
* Preliminaries
  – Binomial Distribution
  – Beta Distribution
  – Normal Distribution
  – Gamma and Inverted Gamma Distributions
  – T-Distribution
* Bayesian Inference
  – Joint Distribution
  – Marginal Distribution
  – Posterior Distribution
  – Predictive Distribution
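For concreteness, a minimal sketch of the Beta–Binomial conjugate update in Python; the prior parameters a, b and the data s, n below are illustrative placeholders, not examples from the course.

    # Beta-Binomial conjugate model: a minimal numerical sketch.
    # Prior: theta ~ Beta(a, b); likelihood: s successes in n Bernoulli(theta) trials.
    # Posterior: theta | s, n ~ Beta(a + s, b + n - s).
    from scipy import stats

    a, b = 1.0, 1.0          # illustrative uniform prior
    n, s = 20, 14            # illustrative data: 14 successes in 20 trials

    post = stats.beta(a + s, b + n - s)      # posterior distribution
    print("posterior mean:", post.mean())
    print("95% credible interval:", post.interval(0.95))

    # Posterior predictive probability that the next trial is a success
    # equals the posterior mean of theta for the Bernoulli case.
    print("predictive P(next success):", (a + s) / (a + b + n))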
Linear Regression
* Preliminaries
  – Multivariate Normal Distribution
  – Gamma and Inverted Gamma Distributions
* Bayesian Inference
  – Full Conditionals
  – MCMC
* Slice Sampling
* Autocorrelated Errors
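A minimal sketch of a Gibbs sampler that alternates between the normal full conditional for beta and the inverted-gamma full conditional for sigma^2; the prior settings, simulated data, and chain lengths are illustrative placeholders, not the course's examples.

    # Gibbs sampler for y = X beta + e, e ~ N(0, sigma^2 I),
    # with priors beta ~ N(b0, B0) and sigma^2 ~ Inverted Gamma.
    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 100, 3
    X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
    beta_true = np.array([1.0, 2.0, -0.5])
    y = X @ beta_true + rng.normal(scale=0.8, size=n)

    b0, B0inv = np.zeros(k), np.eye(k) / 100.0   # diffuse normal prior on beta
    a0, d0 = 3.0, 3.0                            # inverted-gamma prior on sigma^2

    beta, sig2 = np.zeros(k), 1.0
    draws = []
    for it in range(2000):
        # Full conditional for beta: normal with precision X'X/sig2 + B0inv
        Q = X.T @ X / sig2 + B0inv
        Qinv = np.linalg.inv(Q)
        m = Qinv @ (X.T @ y / sig2 + B0inv @ b0)
        beta = rng.multivariate_normal(m, Qinv)
        # Full conditional for sigma^2: inverted gamma
        resid = y - X @ beta
        shape = (a0 + n) / 2.0
        scale = (d0 + resid @ resid) / 2.0
        sig2 = scale / rng.gamma(shape)          # 1/Gamma draw = inverted gamma
        if it >= 500:                            # discard burn-in
            draws.append(np.append(beta, sig2))

    draws = np.array(draws)
    print("posterior means (beta, sigma^2):", draws.mean(axis=0))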
Multivariate Regression
* Preliminaries
  – Matrix Facts
  – Matrix Normal Distribution
  – Wishart and Inverted Wishart Distributions
* Bayesian Inference
  – Full Conditionals
  – MCMC
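A minimal sketch of the Gibbs cycle for Y = XB + E with matrix normal and inverted Wishart full conditionals; for brevity it assumes a flat prior on B, and the inverted Wishart hyperparameters and simulated data are placeholders.

    # Multivariate regression Y = X B + E, rows of E ~ N(0, Sigma),
    # with a flat prior on B and Sigma ~ Inverted Wishart(nu0, V0).
    import numpy as np
    from scipy.stats import invwishart, matrix_normal

    rng = np.random.default_rng(1)
    n, k, m = 200, 2, 3
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    B_true = np.array([[1.0, 0.5, -1.0],
                       [2.0, -0.3, 0.7]])
    Sigma_true = 0.5 * np.eye(m) + 0.5 * np.ones((m, m))
    Y = X @ B_true + rng.multivariate_normal(np.zeros(m), Sigma_true, size=n)

    nu0, V0 = m + 2, np.eye(m)          # inverted Wishart prior on Sigma
    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Y           # least squares estimate

    Sigma = np.eye(m)
    for it in range(1000):
        # Full conditional for B | Sigma: matrix normal centered at B_hat
        B = matrix_normal(mean=B_hat, rowcov=XtX_inv, colcov=Sigma).rvs(random_state=rng)
        # Full conditional for Sigma | B: inverted Wishart
        E = Y - X @ B
        Sigma = invwishart(df=nu0 + n, scale=V0 + E.T @ E).rvs(random_state=rng)

    print("last draw of B:\n", B)
    print("last draw of Sigma:\n", Sigma)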
Hierarchical Bayes Regression: Interaction Model
* Preliminaries
  – Application of multiple and multivariate regression
* Bayesian Inference
  – Within & Between Models
  – Full Conditionals
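The within & between structure shows up in the full conditional for each unit-level coefficient vector. Below is a minimal sketch of that single step only, holding the unit variance, the between-model coefficients, and the between-model covariance fixed (in the full sampler these have their own full conditionals); all numerical settings are placeholders.

    # Within & between models in hierarchical Bayes regression:
    #   within:  y_i = X_i beta_i + e_i,    e_i ~ N(0, sigma_i^2 I)
    #   between: beta_i = Delta' z_i + u_i, u_i ~ N(0, V_beta)
    # Sketch of the full conditional draw of one beta_i.
    import numpy as np

    rng = np.random.default_rng(2)

    def draw_beta_i(y_i, X_i, sigma2_i, prior_mean, V_beta):
        """One Gibbs draw of beta_i from its normal full conditional."""
        V_beta_inv = np.linalg.inv(V_beta)
        Q = X_i.T @ X_i / sigma2_i + V_beta_inv            # posterior precision
        Q_inv = np.linalg.inv(Q)
        m = Q_inv @ (X_i.T @ y_i / sigma2_i + V_beta_inv @ prior_mean)
        return rng.multivariate_normal(m, Q_inv)

    # Illustrative data for one unit with k = 2 coefficients
    k = 2
    X_i = np.column_stack([np.ones(10), rng.normal(size=10)])
    y_i = X_i @ np.array([1.0, -0.5]) + rng.normal(scale=0.5, size=10)

    prior_mean = np.zeros(k)       # stands in for Delta' z_i for this unit
    V_beta = np.eye(k)             # between-model covariance (placeholder)
    print(draw_beta_i(y_i, X_i, 1.0, prior_mean, V_beta))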
Hierarchical Bayes Regression: Mixture Model
* Preliminaries
  – Multinomial Distribution
  – Dirichlet Distribution
  – Mixture Distributions
* Bayesian Inference
  – Latent Variables
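A minimal sketch of the latent-variable steps for a finite normal mixture: each indicator is drawn from its multinomial full conditional and the weights from their Dirichlet full conditional. The component means and variances are held fixed for brevity, and the data and prior below are illustrative placeholders.

    # Finite mixture with latent indicators z_i:
    #   y_i | z_i = j ~ N(mu_j, sigma_j^2),  P(z_i = j) = pi_j,  pi ~ Dirichlet(alpha).
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    # Illustrative data from a two-component mixture
    y = np.concatenate([rng.normal(-2, 1, 150), rng.normal(3, 1, 50)])

    mu = np.array([-2.0, 3.0])      # current component means (held fixed here)
    sd = np.array([1.0, 1.0])       # current component std deviations
    alpha = np.array([1.0, 1.0])    # Dirichlet prior on the weights
    pi = np.array([0.5, 0.5])

    for it in range(100):
        # Multinomial full conditional for each latent indicator z_i
        like = norm.pdf(y[:, None], loc=mu, scale=sd)        # n x J likelihoods
        probs = like * pi
        probs /= probs.sum(axis=1, keepdims=True)
        u = rng.random(len(y))
        z = (u > probs[:, 0]).astype(int)                    # inverse-CDF draw for J = 2
        # Dirichlet full conditional for the mixture weights
        counts = np.bincount(z, minlength=2)
        pi = rng.dirichlet(alpha + counts)

    print("final weights:", pi, "counts:", counts)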
Probit Model
* Preliminaries
  – Random Utility Model
* Bayesian Inference
  – Latent Variables
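A minimal sketch of the latent-utility augmentation for the probit model: given beta, each latent utility is drawn from a normal truncated to be positive when y_i = 1 and negative when y_i = 0, and beta is then drawn as in linear regression with unit error variance. A flat prior on beta is assumed for brevity, and the simulated data are placeholders.

    # Probit model via latent utilities:
    #   z_i = x_i' beta + e_i, e_i ~ N(0, 1),  y_i = 1{z_i > 0}.
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(4)
    n, k = 500, 2
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true = np.array([-0.5, 1.0])
    y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

    XtX_inv = np.linalg.inv(X.T @ X)
    beta = np.zeros(k)
    for it in range(500):
        # Truncated-normal full conditional for each latent utility z_i
        mean = X @ beta
        lower = np.where(y == 1, -mean, -np.inf)   # z > 0 when y = 1
        upper = np.where(y == 1, np.inf, -mean)    # z < 0 when y = 0
        z = mean + truncnorm.rvs(lower, upper, size=n, random_state=rng)
        # Normal full conditional for beta (unit error variance, flat prior)
        m = XtX_inv @ X.T @ z
        beta = rng.multivariate_normal(m, XtX_inv)

    print("last beta draw:", beta, "(true:", beta_true, ")")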
Logit Model
* Preliminaries
  – Extreme Value Distribution
* Bayesian Inference
  – Hastings–Metropolis Algorithm
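A minimal sketch of a random-walk Hastings–Metropolis sampler for the logit coefficients: a candidate is proposed from a symmetric normal step and accepted with probability equal to the posterior ratio. The diffuse N(0, 100 I) prior, step size, and simulated data are illustrative placeholders.

    # Logit model estimated with a random-walk Hastings-Metropolis sampler.
    import numpy as np

    rng = np.random.default_rng(5)
    n, k = 500, 2
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    beta_true = np.array([-0.5, 1.5])
    p = 1.0 / (1.0 + np.exp(-(X @ beta_true)))
    y = rng.binomial(1, p)

    def log_post(beta):
        """Logit log likelihood plus a diffuse N(0, 100 I) log prior."""
        eta = X @ beta
        loglik = np.sum(y * eta - np.logaddexp(0.0, eta))
        return loglik - 0.5 * beta @ beta / 100.0

    beta = np.zeros(k)
    lp = log_post(beta)
    accept, draws = 0, []
    for it in range(5000):
        cand = beta + 0.2 * rng.normal(size=k)       # symmetric random-walk proposal
        lp_cand = log_post(cand)
        if np.log(rng.random()) < lp_cand - lp:      # accept with prob min(1, ratio)
            beta, lp = cand, lp_cand
            accept += 1
        if it >= 1000:
            draws.append(beta)

    draws = np.array(draws)
    print("acceptance rate:", accept / 5000)
    print("posterior means:", draws.mean(axis=0), "(true:", beta_true, ")")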
Conclusion
* Good models include all major sources of uncertainty and variation.
* Bayesian inference explicitly accounts for these sources.
* MCMC has proven to be a flexible method for analyzing complex models.
* This course has presented the basic framework.
* As the complexity of your problems increases, you will want to go beyond the basics.