Contents
Preface
Chapter 1:
Probability and Measure
1.1. The Texas lotto
1.1.1 Introduction
1.1.2 Binomial numbers
1.1.3 Sample space
1.1.4 Algebras and sigma-algebras of events
1.1.5 Probability measure
1.2. Quality control
1.2.1 Sampling without replacement
1.2.2 Quality control in practice
1.2.3 Sampling with replacement
1.2.4 Limits of the hypergeometric and binomial probabilities
1.3. Why do we need sigma-algebras of events?
1.4. Properties of algebras and sigma-algebras
1.4.1 General properties
1.4.2 Borel sets
1.5. Properties of probability measures
1.6. The uniform probability measure
1.6.1 Introduction
1.6.2 Outer measure
1.7. Lebesgue measure and Lebesgue integral
1.7.1 Lebesgue measure
1.7.2 Lebesgue integral
1.8. Random variables and their distributions
1.8.1 Random variables and vectors
1.8.2 Distribution functions
1.9. Density functions
1.10. Conditional probability, Bayes' rule, and independence
1.10.1 Conditional probability
1.10.2 Bayes' rule
1.10.3 Independence
1.11. Exercises
Appendices:
1.A. Common structure of the proofs of Theorems 6 and 10
1.B. Extension of an outer measure to a probability measure
Chapter 2:
Borel Measurability, Integration, and Mathematical Expectations
2.1. Introduction
2.2. Borel measurability
2.3. Integrals of Borel measurable functions with respect to a probability measure
2.4. General measurability, and integrals of random variables with respect to probability measures
2.5. Mathematical expectation
2.6. Some useful inequalities involving mathematical expectations
2.6.1 Chebyshev's inequality
2.6.2 Hölder's inequality
2.6.3 Lyapunov's inequality
2.6.4 Minkowski's inequality
2.6.5 Jensen's inequality
2.7. Expectations of products of independent random variables
2.8. Moment generating functions and characteristic functions
2.8.1 Moment generating functions
2.8.2 Characteristic functions
2.9. Exercises
Appendix:
2.A. Uniqueness of characteristic functions
Chapter 3:
Conditional Expectations
3.1. Introduction
3.2. Properties of conditional expectations
3.3. Conditional probability measures and conditional independence
3.4. Conditioning on increasing sigma-algebras
3.5. Conditional expectations as the best forecast schemes
3.6. Exercises
Appendix:
3.A. Proof of Theorem 3.12
Chapter 4:
Distributions and Transformations
4.1. Discrete distributions
4.1.1 The hypergeometric distribution
4.1.2 The binomial distribution
4.1.3 The Poisson distribution
4.1.4 The negative binomial distribution
4.2. Transformations of discrete random vectors
4.3. Transformations of absolutely continuous random variables
4.4. Transformations of absolutely continuous random vectors
4.4.1 The linear case
4.4.2 The nonlinear case
4.5. The normal distribution
4.5.1 The standard normal distribution
4.5.2 The general normal distribution
4.6. Distributions related to the normal distribution
4.6.1 The chi-square distribution
4.6.2 The Student t distribution
4.6.3 The standard Cauchy distribution
4.6.4 The F distribution
4.7. The uniform distribution and its relation to the standard normal distribution
4.8. The gamma distribution
4.9. Exercises
Appendices:
4.A. Tedious derivations
4.B. Proof of Theorem 4.4
Chapter 5:
The Multivariate Normal Distribution and its Application to Statistical Inference
5.1. Expectation and variance of random vectors
5.2. The multivariate normal distribution
5.3. Conditional distributions of multivariate normal random variables
5.4. Independence of linear and quadratic transformations of multivariate normal random variables
5.5. Distribution of quadratic forms of multivariate normal random variables
5.6. Applications to statistical inference under normality
5.6.1 Estimation
5.6.2 Confidence intervals
5.6.3 Testing parameter hypotheses
5.7. Applications to regression analysis
5.7.1 The linear regression model
5.7.2 Least squares estimation
5.7.3 Hypothesis testing
5.8. Exercises
Appendix:
5.A. Proof of Theorem 5.8
Chapter 6:
Modes of Convergence
6.1. Introduction
6.2. Convergence in probability and the weak law of large numbers
6.3. Almost sure convergence, and the strong law of large numbers
6.4. The uniform law of large numbers and its applications
6.4.1 The uniform weak law of large numbers
6.4.2 Applications of the uniform weak law of large numbers
6.4.2.1 Consistency of M-estimators
6.4.2.2 Generalized Slutsky's theorem
6.4.3 The uniform strong law of large numbers and its applications
6.5. Convergence in distribution
6.6. Convergence of characteristic functions
6.7. The central limit theorem
6.8. Stochastic boundedness, tightness, and the Op and op notations
6.9. Asymptotic normality of M-estimators
6.10. Hypothesis testing
6.11. Exercises
Appendices:
6.A. Proof of the uniform weak law of large numbers
6.B. Almost sure convergence and strong laws of large numbers
6.C. Convergence of characteristic functions and distributions
Chapter 7:
Dependent Laws of Large Numbers and Central Limit Theorems
7.1. Stationarity and the Wold decomposition
7.2. Weak laws of large numbers for stationary processes
7.3. Mixing conditions
7.4. Uniform weak laws of large numbers
7.4.1 Random functions depending on finite-dimensional random vectors
7.4.2 Random functions depending on infinite-dimensional random vectors
7.4.3 Consistency of M-estimators
7.5. Dependent central limit theorems
7.5.1 Introduction
7.5.2 A generic central limit theorem
7.5.3 Martingale difference central limit theorems
7.6. Exercises
Appendix:
7.A. Hilbert spaces
Chapter 8:
Maximum Likelihood Theory
8.1. Introduction
8.2. Likelihood functions
8.3. Examples
8.3.1 The uniform distribution
8.3.2 Linear regression with normal errors
8.3.3 Probit and Logit models
8.3.4 The Tobit model
8.4. Asymptotic properties of ML estimators
8.4.1 Introduction
8.4.2 First- and second-order conditions
8.4.3 Generic conditions for consistency and asymptotic normality
8.4.4 Asymptotic normality in the time series case
8.4.5 Asymptotic efficiency of the ML estimator
8.5. Testing parameter restrictions
8.5.1 The pseudo t test and the Wald test
8.5.2 The Likelihood Ratio test
8.5.3 The Lagrange Multiplier test
8.5.4 Which test to use?
8.6. Exercises
Appendix I:
Review of Linear Algebra
I.1. Vectors in a Euclidean space
I.2. Vector spaces
I.3. Matrices
I.4. The inverse and transpose of a matrix
I.5. Elementary matrices and permutation matrices
I.6. Gaussian elimination of a square matrix, and the Gauss-Jordan iteration for inverting a matrix
I.6.1 Gaussian elimination of a square matrix
I.6.2 The Gauss-Jordan iteration for inverting a matrix
I.7. Gaussian elimination of a non-square matrix
I.8. Subspaces spanned by the columns and rows of a matrix
I.9. Projections, projection matrices, and idempotent matrices
I.10. Inner product, orthogonal bases, and orthogonal matrices
I.11. Determinants: Geometric interpretation and basic properties
I.12. Determinants of block-triangular matrices
I.13. Determinants and co-factors
I.14. Inverse of a matrix in terms of co-factors
I.15. Eigenvalues and eigenvectors
I.15.1 Eigenvalues
I.15.2 Eigenvectors
I.15.3 Eigenvalues and eigenvectors of symmetric matrices
I.16. Positive definite and semi-definite matrices
I.17. Generalized eigenvalues and eigenvectors
I.18. Exercises
Appendix II:
Miscellaneous Mathematics
II.1. Sets and set operations
II.1.1 General set operations
II.1.2 Sets in Euclidean spaces
II.2. Supremum and infimum
II.3. Limsup and liminf
II.4. Continuity of concave and convex functions
II.5. Compactness
II.6. Uniform continuity
II.7. Derivatives of functions of vectors and matrices
II.8. The mean value theorem
II.9. Taylor's theorem
II.10. Optimization
Appendix III:
A Brief Review of Complex Analysis
III.1. The complex number system
III.2. The complex exponential function
III.3. The complex logarithm
III.4. Series expansion of the complex logarithm
III.5. Complex integration
References