1) This chapter introduces covariance and correlation, two concepts that prepare the way for the treatment of regression analysis to come. A second and equally important objective is to show you how to manipulate expressions involving sample variance and covariance. Several detailed examples are provided to give you practice. These manipulation rules are used extensively in later chapters, and it is vital that they become second nature to you: they simplify the mathematics and make the analysis much easier to follow.
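As a concrete illustration, here is a minimal sketch in Python (the data and the constants a and b are invented for the example) that computes the sample covariance and correlation and checks one of the manipulation rules numerically, namely Cov(a + bY, X) = b Cov(Y, X).

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.normal(size=100)
    y = 2.0 * x + rng.normal(size=100)

    def sample_cov(u, v):
        # Sample covariance: the mean of the products of deviations from the means.
        return np.mean((u - u.mean()) * (v - v.mean()))

    def sample_corr(u, v):
        # Sample correlation: the covariance scaled by the two standard deviations.
        return sample_cov(u, v) / np.sqrt(sample_cov(u, u) * sample_cov(v, v))

    a, b = 5.0, 3.0
    print(sample_cov(x, y), sample_corr(x, y))
    # Check the rule Cov(a + b*Y, X) = b * Cov(Y, X): the two values should agree.
    print(sample_cov(a + b * y, x), b * sample_cov(y, x))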
2) This chapter shows how a hypothetical linear relationship between two variables can be quantified using appropriate data. The principle of least squares regression analysis is explained, and expressions for the coefficients are derived.
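To make the result concrete, the following sketch (with simulated data; the true intercept 1.0 and slope 0.5 are assumptions chosen for the illustration) applies the standard least squares expressions for the coefficients of Y = b1 + b2 X + u, namely b2 = Cov(X, Y)/Var(X) and b1 = ybar - b2*xbar.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, size=50)
    y = 1.0 + 0.5 * x + rng.normal(scale=0.5, size=50)

    # Slope: sample covariance of X and Y divided by the sample variance of X.
    b2 = np.mean((x - x.mean()) * (y - y.mean())) / np.mean((x - x.mean()) ** 2)
    # Intercept: the fitted line passes through the point of sample means.
    b1 = y.mean() - b2 * x.mean()
    print(b1, b2)  # the estimates should be close to 1.0 and 0.5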
3) With the aid of regression analysis we can obtain estimates of the parameters of a relationship. However, they are only estimates, and the next question to ask is how reliable they are. We shall answer this first in general terms, investigating the conditions for unbiasedness and the factors governing the variance of the estimators. Building on this, we shall develop a means of testing whether a regression estimate is compatible with a specific prior hypothesis concerning the true value of a parameter. From this we shall derive a confidence interval for the true value: the set of all hypothetical values not contradicted by the experimental result. We shall also see how to test whether the goodness of fit of a regression equation is better than might be expected on the basis of pure chance.
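The following sketch (simulated data; SciPy is assumed to be available for the t critical value) illustrates these ideas for the simple regression model: the standard error of the slope, the t statistic for a hypothesized true slope of zero, a 95 percent confidence interval, and the F statistic for the overall goodness of fit.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    n = 50
    x = rng.uniform(0, 10, size=n)
    y = 1.0 + 0.5 * x + rng.normal(scale=1.0, size=n)

    # Fit by least squares.
    b2 = np.mean((x - x.mean()) * (y - y.mean())) / np.mean((x - x.mean()) ** 2)
    b1 = y.mean() - b2 * x.mean()
    residuals = y - b1 - b2 * x

    # Residual variance with n - 2 degrees of freedom, and the slope standard error.
    s2 = np.sum(residuals ** 2) / (n - 2)
    se_b2 = np.sqrt(s2 / np.sum((x - x.mean()) ** 2))

    # t statistic for H0: true slope = 0, and a 95% confidence interval:
    # the set of hypothetical values not rejected at the 5 percent level.
    t_stat = b2 / se_b2
    t_crit = stats.t.ppf(0.975, df=n - 2)
    ci = (b2 - t_crit * se_b2, b2 + t_crit * se_b2)

    # F statistic for overall goodness of fit (with one regressor, F = t^2).
    r2 = 1 - np.sum(residuals ** 2) / np.sum((y - y.mean()) ** 2)
    f_stat = (r2 / 1) / ((1 - r2) / (n - 2))
    print(t_stat, ci, f_stat)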
4) In this chapter least squares regression analysis is generalized to cover the case in which there are several or many explanatory variables in the regression model, rather than just one. Two new topics are discussed. One is the problem of discriminating between the effects of different explanatory variables, a problem that, when particularly severe, is known as multicollinearity. The other is the evaluation of the joint explanatory power of the explanatory variables, as opposed to their individual marginal effects.
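A minimal sketch of these two themes, with simulated data in which the correlation of 0.95 between the regressors is an assumption chosen to make the collinearity severe: the coefficients are computed from the normal equations b = (X'X)^(-1) X'y, and the imprecision of the individual estimates is contrasted with the joint explanatory power measured by R squared.

    import numpy as np

    rng = np.random.default_rng(3)
    n = 100
    x2 = rng.normal(size=n)
    # x3 is constructed to be highly correlated with x2 (correlation about 0.95).
    x3 = 0.95 * x2 + np.sqrt(1 - 0.95 ** 2) * rng.normal(size=n)
    y = 1.0 + 2.0 * x2 - 1.0 * x3 + rng.normal(size=n)

    X = np.column_stack([np.ones(n), x2, x3])   # intercept plus two regressors
    b = np.linalg.solve(X.T @ X, X.T @ y)       # least squares via the normal equations
    residuals = y - X @ b

    # Joint explanatory power: R squared from the residual and total sums of squares.
    r2 = 1 - residuals @ residuals / np.sum((y - y.mean()) ** 2)

    # With collinear regressors the individual coefficients are imprecise even
    # though the joint fit can be good: compare the standard errors with R squared.
    s2 = residuals @ residuals / (n - X.shape[1])
    se = np.sqrt(np.diag(s2 * np.linalg.inv(X.T @ X)))
    print(b, se, r2)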