zj_ocean posted on 2010-2-28 13:23
8# bobguy
For OLS, we really do not need the error terms to be normal. But if you want to draw statistical inference on the betas or the responses, we usually assume that the error terms are normal, because so many statistical tools are built on normality, like the F-test and the chi-square test. And luckily, the Maximum Likelihood estimators (assuming that the responses are normal) are the same as the OLS estimators.
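A quick numeric sketch of that last point (my own illustration, not from the post): fit the same simulated data once by closed-form OLS and once by numerically maximizing the normal log-likelihood; the two coefficient estimates coincide.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# OLS: closed-form least-squares solution
beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)

# ML under normal errors: minimize the negative log-likelihood in (beta, log sigma)
def negloglik(theta):
    beta, log_sigma = theta[:2], theta[2]
    sigma2 = np.exp(2 * log_sigma)
    resid = y - X @ beta
    return 0.5 * n * np.log(2 * np.pi * sigma2) + 0.5 * resid @ resid / sigma2

res = minimize(negloglik, x0=np.zeros(3), method="BFGS")
beta_ml = res.x[:2]

print(beta_ols, beta_ml)  # the two beta estimates agree
```

The agreement is exact in theory: for fixed sigma, the normal log-likelihood is a decreasing function of the residual sum of squares, so maximizing it is the same problem as least squares.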
Generalized Linear Models (different from General Linear Models) are used to deal with responses from the exponential family. The corresponding fitting method is iteratively reweighted least squares, which carries out the ML estimation.
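To make the "iteratively reweighted least squares" idea concrete, here is a minimal sketch (my own toy code, not from the post) of IRLS for a Poisson GLM with log link; each iteration solves a weighted least-squares problem on a working response.

```python
import numpy as np

def irls_poisson(X, y, n_iter=25):
    """Fit a Poisson GLM (log link) by iteratively reweighted least squares.

    Each step is a weighted least-squares solve on the working response z,
    which is the Fisher-scoring form of ML estimation for a GLM.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta           # linear predictor
        mu = np.exp(eta)         # mean under the log link
        W = mu                   # IRLS weights: Var(y) = mu for Poisson
        z = eta + (y - mu) / mu  # working response
        XtW = X.T * W            # weighted normal equations: (X'WX) beta = X'Wz
        beta = np.linalg.solve(XtW @ X, XtW @ z)
    return beta

rng = np.random.default_rng(1)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([0.5, 0.8])
y = rng.poisson(np.exp(X @ beta_true))

beta_hat = irls_poisson(X, y)
print(beta_hat)  # roughly recovers beta_true
```

For other exponential-family responses, only the link, the weights, and the working response change; the weighted least-squares core stays the same.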
Sometimes we might normalize or standardize the covariates because of their units. If your model contains covariates like length, volume, speed, or weight measured in various units, you'd better standardize the data.
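Standardizing just means converting each covariate to z-scores, so columns measured in very different units end up on a comparable scale. A small sketch (illustrative, with made-up scales):

```python
import numpy as np

def standardize(X):
    """Center each column and scale it to unit standard deviation (z-scores)."""
    mean = X.mean(axis=0)
    std = X.std(axis=0, ddof=1)
    return (X - mean) / std, mean, std

rng = np.random.default_rng(2)
# two covariates on very different scales, e.g. length in mm and weight in tons
X = np.column_stack([rng.normal(1500, 300, size=100),
                     rng.normal(2.0, 0.4, size=100)])

Z, mean, std = standardize(X)
print(Z.mean(axis=0), Z.std(axis=0, ddof=1))  # each column: mean ~0, sd 1
```

Keeping `mean` and `std` around matters in practice: new observations must be transformed with the training-set statistics, and coefficients can be mapped back to the original units by dividing by `std`.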
"But if you want to take statistical inference of betas or responses, wealways assume that the error terms be normal. Because we have so manystatistical tools from normal, like F-test, chi-sq test." --- Agree ! For a small sample size, the normality is necessary for statistic inference. This assumption can be relaxed under a large sample size.
"And luckly, the Maximum Likelihood estimators" OLS is almost the same as ML under normality assumption. But variance estimation is biased and need to be adjusted by DF in ML. Both are equivelant under a large sample size. OLS was used by Gauss and others before the normal distribution was introduced by Gauss several years later. And ML was invented by Fisher many years later. I believe he got some idea from Laplace.
"Sometime, we might normalize or standardlize the covariates for theunits. If your models contains covariates length, volume, speed, orweight with various units, you'd better standardlize the data. " When you want to contrast the coefs, the normalization of the covariates makes sense. But it is not necessary.