http://highered.mcgraw-hill.com/sites/0072970928/student_view0/
Chp 1 The Nature and Scope of Econometrics
http://www.garpdigitallibrary.org/display/product.asp?pid=1986
Research in economics, finance, management, marketing, and related disciplines is becoming increasingly quantitative. Beginning students in these fields are encouraged, if not required, to take a course or two in econometrics, a field of study that has become quite popular. This chapter gives the beginner an overview of what econometrics is all about.
1.1 What Is Econometrics?
1.2 Why Study Econometrics?
1.3 The Methodology of Econometrics
1.3.1 Creating a statement of theory or hypothesis
1.3.2 Collecting data
1.3.3 Specifying the mathematical model of labor force participation
1.3.4 Specifying the statistical, or econometric, model of labor force participation
1.3.5 Estimating the parameters of the chosen econometric model
1.3.6 Checking for model adequacy: model specification testing
1.3.7 Testing the hypothesis derived from the model
1.3.8 Using the model for prediction or forecasting
1.4 The road ahead
Key terms and concepts
Questions
Problems
Appendix 1A: Economic Data on the World Wide Web
Chp 2 Review of Statistics: Probability and Probability Distributions
http://www.garpdigitallibrary.org/display/product.asp?pid=1997
The purpose of this and the following three chapters is to review some fundamental statistical concepts that are needed to understand Essentials of Econometrics. These four chapters will serve as a refresher course for those students who have had a basic course in statistics and, for those whose knowledge of statistics has become somewhat rusty, will provide a unified framework for following the discussion of the material in the remaining parts of this book. Students who have had very little statistics should supplement these chapters with a good statistics book. (Some references are given at the end of this chapter.) Note that the discussion in Chapters 2 through 5 is nonrigorous and is by no means a substitute for a basic course in statistics. It is simply an overview that is intended as a bridge to econometrics.
2.1 Some notation
2.2 Experiment, sample space, sample point, and events
2.3 Random variables
2.4 Probability
2.5 Random variables and their probability distributions
2.6 Multivariate probability density functions
2.7 Summary and conclusions
Key terms and concepts
References
Questions
Problems
Chp 3 Characteristics of Probability Distributions
http://www.garpdigitallibrary.org/display/product.asp?pid=1999
Although a PMF (PDF) indicates the values taken by a random variable (r.v.) and their associated probabilities, often we are not interested in the entire PMF. Thus, in the PMF of Example 2.13 we may not want the individual probabilities of obtaining no heads, one head, or two heads. Rather, we may wish to find out the average number of heads obtained when tossing a coin several times. In other words, we may be interested in some summary characteristics, or, more technically, the moments of a probability distribution. Two of the most commonly used summary measures or moments are the expected value (called the first moment of the probability distribution) and the variance (called the second moment of the probability distribution). On occasion, we will need higher moments of probability distributions, which we will discuss as we progress.
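As a minimal worked illustration (assuming, as the wording above suggests, that Example 2.13 concerns the number of heads X obtained in two tosses of a fair coin, so that X takes the values 0, 1, and 2 with probabilities 1/4, 1/2, and 1/4), the two moments are

\begin{align*}
E(X) &= \sum_x x\,f(x) = 0\left(\tfrac{1}{4}\right) + 1\left(\tfrac{1}{2}\right) + 2\left(\tfrac{1}{4}\right) = 1, \\
\operatorname{var}(X) &= \sum_x (x-\mu_X)^2 f(x) = (0-1)^2\tfrac{1}{4} + (1-1)^2\tfrac{1}{2} + (2-1)^2\tfrac{1}{4} = \tfrac{1}{2}.
\end{align*}

That is, the average number of heads in this experiment is 1, and the dispersion about that average is 1/2.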
3.1 Expected value: a measure of central tendency
3.2 Variance: a measure of dispersion
3.3 Covariance
3.4 Correlation coefficient
3.5 Conditional expectation
3.6 Skewness and kurtosis
3.7 From the population to the sample
3.8 Summary
Key terms and concepts
Questions
Problems
Optional exercises
Chp 4 Some Important Probability Distributions
http://www.garpdigitallibrary.org/display/product.asp?pid=2000
In the previous chapter we noted that a random variable (r.v.) can be described by a few characteristics, or moments, of its probability function (PDF or PMF), such as the expected value and variance. This, however, presumes that we know the PDF of that r.v., which is a tall order since there are all kinds of random variables. In practice, however, some random variables occur so frequently that statisticians have determined their PDFs and documented their properties. For our purposes, we will consider only those PDFs that are of direct interest to us. But keep in mind that there are several other PDFs that statisticians have studied which can be found in any standard statistics textbook. In this chapter we will discuss the following four probability distributions:
1. The normal distribution
2. The t distribution
3. The chi-square (χ^2) distribution
4. The F distribution
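As a quick illustration of the point that these PDFs are tabulated and readily available, here is a minimal sketch in Python using SciPy (software of our own choosing, not something prescribed by the text), evaluating each of the four densities at an arbitrary point with arbitrary degrees of freedom:

    # A sketch only (Python with SciPy assumed; not part of the text): evaluating the
    # densities of the four distributions discussed in this chapter.
    from scipy import stats

    x = 1.5  # an arbitrary point at which to evaluate each density

    print(stats.norm.pdf(x, loc=0, scale=1))   # normal, mean 0 and standard deviation 1
    print(stats.t.pdf(x, df=10))               # t with 10 degrees of freedom
    print(stats.chi2.pdf(x, df=4))             # chi-square with 4 degrees of freedom
    print(stats.f.pdf(x, dfn=3, dfd=20))       # F with 3 numerator and 20 denominator d.f.

These same four families reappear as the reference distributions for the test statistics used in the hypothesis-testing chapters that follow.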
4.1 The Normal distribution
4.2 The t distribution
4.3 The chi-square probability distribution
4.4 The F distribution
4.5 Summary
Key terms and concepts
Questions
Problems
Chp 5 Statistical Inference: Estimation and Hypothesis Testing
http://www.garpdigitallibrary.org/display/product.asp?pid=2002
Equipped with the knowledge of probability; random variables; probability distributions; and characteristics of probability distributions, such as expected value, variance, covariance, correlation, and conditional expectation, we are now ready to undertake the important task of statistical inference. Broadly speaking, statistical inference is concerned with drawing conclusions about the nature of some population (e.g., the normal) on the basis of a random sample that has supposedly been drawn from that population. Thus, if we believe that a particular sample has come from a normal population and we compute the sample mean and sample variance from that sample, we may want to know what the true (population) mean is and what the variance of that population may be.
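To make this concrete, here is a minimal sketch (Python with NumPy and SciPy assumed, and entirely hypothetical data; neither the tools nor the numbers come from the text) of estimating a population mean and variance from a sample and attaching a 95% confidence interval to the estimated mean:

    # A sketch only (Python assumed; hypothetical data): inferring a population mean
    # and variance from a random sample, as described above.
    import numpy as np
    from scipy import stats

    sample = np.array([12.1, 9.8, 11.4, 10.7, 12.9, 10.2, 11.8, 9.5])  # hypothetical sample

    n = sample.size
    xbar = sample.mean()        # sample mean: point estimate of the population mean
    s2 = sample.var(ddof=1)     # sample variance: point estimate of the population variance
    se = np.sqrt(s2 / n)        # estimated standard error of the sample mean

    t_crit = stats.t.ppf(0.975, df=n - 1)          # critical value from the t distribution
    ci = (xbar - t_crit * se, xbar + t_crit * se)  # 95% confidence interval for the mean
    print(xbar, s2, ci)

Point estimation, interval estimation, and the related hypothesis tests are taken up in Sections 5.3 through 5.5.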
5.1 The meaning of statistical inference
5.2 Estimation and hypothesis testing: twin branches of statistical inference
5.3 Estimation of parameters
5.4 Properties of point estimators
5.5 Statistical inference: hypothesis testing
5.6 Summary
Key terms and concepts
Questions
Problems
Chp 6 Basic Ideas of Linear Regression: The Two-Variable Model
http://www.garpdigitallibrary.org/display/product.asp?pid=2004
Chapter 6 discusses the basic ideas of linear regression in terms of the simplest possible linear regression model, namely, the two-variable model. We make an important distinction between the population regression model and the sample regression model and estimate the former from the latter. This estimation is done using the method of least squares, one of the most popular methods of estimation.
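For reference, writing the sample regression function (in notation of our own choosing, not necessarily the book's) as Yhat_i = b_1 + b_2 X_i, the least-squares estimates minimize the sum of squared residuals and take the familiar closed form whose derivation is the subject of Appendix 6A:

\[
b_2 = \frac{\sum_i (X_i - \bar{X})(Y_i - \bar{Y})}{\sum_i (X_i - \bar{X})^2},
\qquad
b_1 = \bar{Y} - b_2 \bar{X}.
\]

Here the bars denote the sample means of the explanatory and dependent variables.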
6.1 Meaning of regression
6.2 The population regression function (PRF): a hypothetical example
6.3 Statistical or stochastic specification of the population regression function
6.4 The nature of the stochastic error term
6.5 The sample regression function (SRF)
6.6 The special meaning of the term "linear" regression
6.7 Two-variable versus multiple linear regression
6.8 Estimation of parameters: the method of ordinary least squares
6.9 Putting it all together
6.10 Some illustrative examples
6.11 Summary
Key terms and concepts
Questions
Problems
Optional questions
Appendix 6A: Derivation of least-squares estimates
Chp 7 The Two-Variable Model: Hypothesis Testing
http://www.garpdigitallibrary.org/display/product.asp?pid=2006
Chapter 7 considers hypothesis testing. As in any hypothesis testing in statistics, we try to find out whether the estimated values of the parameters of the regression model are compatible with the hypothesized values of the parameters. We do this hypothesis testing in the context of the classical linear regression model (CLRM). We discuss why the CLRM is used and point out that it is a useful starting point. In Part III we will reexamine the assumptions of the CLRM to see what happens if one or more of them are not fulfilled.
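As a sketch of the mechanics (again in notation of our own choosing): under the CLRM assumptions, a hypothesized value B_2* of the slope in the two-variable model is tested with the t ratio

\[
t = \frac{b_2 - B_2^*}{\operatorname{se}(b_2)},
\]

which, if the null hypothesis is true, follows the t distribution with n - 2 degrees of freedom and is therefore compared with the appropriate critical value at the chosen level of significance.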
7.1 The classical linear regression model
7.2 Variances and standard errors of ordinary least squares estimators
7.3 Why OLS? The properties of OLS estimators
7.4 The sampling, or probability, distributions of OLS estimators
7.5 Hypothesis testing
7.6 How good is the fitted regression line: the coefficient of determination, r^2
7.7 Reporting the results of regression analysis
7.8 Computer output of the Lotto example
7.9 Normality tests
7.10 A concluding example: relationship between wages and productivity in the US business sector, 1959-2000
7.11 A word about forecasting
7.12 Summary
Key terms and concepts
Questions
Problems
Chp 8 Multiple Regression: Estimation and Hypothesis Testing
http://www.garpdigitallibrary.org/display/product.asp?pid=2007
Chapter 8 extends the idea of the two-variable linear regression model developed in the previous two chapters to multiple regression models, that is, models having more than one explanatory variable. Although in many ways the multiple regression model is an extension of the two-variable model, there are differences in interpreting the coefficients of the model and in the hypothesis-testing procedure.
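As a minimal sketch of such a model (Python with NumPy assumed, with simulated data; the tools and numbers are ours, not the book's), here is ordinary least squares applied to the three-variable model Y = B_1 + B_2 X_2 + B_3 X_3 + u:

    # A sketch only (Python/NumPy assumed; simulated data): OLS estimation of a
    # three-variable model Y = B_1 + B_2*X_2 + B_3*X_3 + u.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50
    X2 = rng.normal(size=n)            # first explanatory variable (hypothetical)
    X3 = rng.normal(size=n)            # second explanatory variable (hypothetical)
    u = rng.normal(scale=0.5, size=n)  # stochastic error term
    Y = 1.0 + 2.0 * X2 - 1.5 * X3 + u  # dependent variable with known coefficients

    X = np.column_stack([np.ones(n), X2, X3])             # regressors, with an intercept column
    b, rss, rank, sv = np.linalg.lstsq(X, Y, rcond=None)  # OLS estimates of B_1, B_2, B_3
    print(b)

Each estimated slope is a partial regression coefficient: it measures the effect of its own regressor while holding the other regressor constant, which is precisely the interpretive difference from the two-variable model noted above.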
8.1 The three-variable linear regression model
8.2 Assumptions of multiple linear regression model
8.3 Estimation of parameters of multiple regression
8.4 Goodness of fit of estimated multiple regression: multiple coefficient of determination, R^2
8.5 Antique clock auction prices revisited
8.6 Hypothesis testing in a multiple regression: general comments
8.7 Testing hypotheses about individual partial regression coefficients
8.8 Testing the joint hypothesis that B_2 = B_3 = 0 or R^2 = 0
8.9 Two-variable regression in the context of multiple regression: introduction to specification bias
8.10 Comparing two R^2 values: the adjusted R^2
8.11 When to add an additional explanatory variable to a model
8.12 Restricted least squares
8.13 Illustrative examples
8.14 Summary
Key terms and concepts
Questions
Problems
Appendix 8A: Derivations of OLS estimators given in equations (8.20) and (8.22)