I got a warning when I ran this:
boxcox price s1 s2 dj dl wd1 wd2 bz yj yj ls lable wl tc yf1 yf2 sh1 sh2 pj dz,lrtest
note: yj dropped because of collinearity
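(Note: yj appears twice in the varlist above, which is presumably why Stata flags it as collinear and drops one copy. Listing each regressor only once, e.g.

boxcox price s1 s2 dj dl wd1 wd2 bz yj ls lable wl tc yf1 yf2 sh1 sh2 pj dz, lrtest

should give the same 18-regressor model that is estimated below, without the note.)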
Fitting comparison model
Iteration 0: log likelihood = -1282.9176
Iteration 1: log likelihood = -1193.974
Iteration 2: log likelihood = -1193.9265
Iteration 3: log likelihood = -1193.9265
Fitting full model
Iteration 0: log likelihood = -1250.9672
Iteration 1: log likelihood = -1139.5853
Iteration 2: log likelihood = -1138.8727
Iteration 3: log likelihood = -1138.8724
Iteration 4: log likelihood = -1138.8724
Fitting comparison models for LR tests
Iteration 0: log likelihood = -1251.0143
Iteration 1: log likelihood = -1139.7452
Iteration 2: log likelihood = -1139.0025
Iteration 3: log likelihood = -1139.0022
Iteration 4: log likelihood = -1139.0022
Iteration 0: log likelihood = -1251.0193
Iteration 1: log likelihood = -1139.6247
Iteration 2: log likelihood = -1138.895
Iteration 3: log likelihood = -1138.8947
Iteration 4: log likelihood = -1138.8947
Iteration 0: log likelihood = -1251.0205
Iteration 1: log likelihood = -1139.662
Iteration 2: log likelihood = -1138.9803
Iteration 3: log likelihood = -1138.98
Iteration 4: log likelihood = -1138.98
Iteration 0: log likelihood = -1260.6813
Iteration 1: log likelihood = -1161.3836
Iteration 2: log likelihood = -1160.991
Iteration 3: log likelihood = -1160.9909
Iteration 4: log likelihood = -1160.9909
Iteration 0: log likelihood = -1251.3219
Iteration 1: log likelihood = -1140.1492
Iteration 2: log likelihood = -1139.4457
Iteration 3: log likelihood = -1139.4454
Iteration 4: log likelihood = -1139.4454
Iteration 0: log likelihood = -1251.1722
Iteration 1: log likelihood = -1140.1672
Iteration 2: log likelihood = -1139.4447
Iteration 3: log likelihood = -1139.4443
Iteration 4: log likelihood = -1139.4443
Iteration 0: log likelihood = -1253.9474
Iteration 1: log likelihood = -1144.8984
Iteration 2: log likelihood = -1144.1239
Iteration 3: log likelihood = -1144.1235
Iteration 4: log likelihood = -1144.1235
Iteration 0: log likelihood = -1251.0128
Iteration 1: log likelihood = -1139.7619
Iteration 2: log likelihood = -1139.0596
Iteration 3: log likelihood = -1139.0593
Iteration 4: log likelihood = -1139.0593
Iteration 0: log likelihood = -1251.0126
Iteration 1: log likelihood = -1139.589
Iteration 2: log likelihood = -1138.8994
Iteration 3: log likelihood = -1138.8991
Iteration 4: log likelihood = -1138.8991
Iteration 0: log likelihood = -1254.4156
Iteration 1: log likelihood = -1143.1787
Iteration 2: log likelihood = -1142.3763
Iteration 3: log likelihood = -1142.3759
Iteration 4: log likelihood = -1142.3759
Number of obs = 305
LR chi2(18) = 110.11
Log likelihood = -1138.8724 Prob > chi2 = 0.000
------------------------------------------------------------------------------
price | Coef. Std. Err. z P>|z| [95% Conf. Interval]
-------------+----------------------------------------------------------------
/theta | -.2239895 .0896193 -2.50 0.012 -.39964 -.0483389
------------------------------------------------------------------------------
Estimates of scale-variant parameters
-------------------------------------------------------------
| Coef. chi2(df) P>chi2(df) df of chi2
-------------+-----------------------------------------------
Notrans |
s1 | -.0235302 0.260 0.610 1
s2 | -.0051872 0.045 0.833 1
dj | .0143059 0.215 0.643 1
dl | .1938586 44.237 0.000 1
wd1 | .0563133 1.146 0.284 1
wd2 | .0585799 1.144 0.285 1
bz | .0935278 10.502 0.001 1
yj | -.0090733 0.078 0.780 1
ls | -.0283363 1.028 0.311 1
lable | -.048456 2.111 0.146 1
wl | -.0183611 0.403 0.526 1
tc | .0466054 3.493 0.062 1
yf1 | -.1577191 6.161 0.013 1
yf2 | -.1266088 4.195 0.041 1
sh1 | .0644183 1.246 0.264 1
sh2 | .0273976 0.374 0.541 1
pj | .0033248 0.053 0.817 1
dz | -.0117198 7.007 0.008 1
_cons | 2.267277
-------------+-----------------------------------------------
/sigma | .1862849
-------------------------------------------------------------
---------------------------------------------------------
Test Restricted LR statistic P-value
H0: log likelihood chi2 Prob > chi2
---------------------------------------------------------
theta = -1 -1173.1427 68.54 0.000
theta = 0 -1142.0913 6.44 0.011
theta = 1 -1250.9672 224.19 0.000
---------------------------------------------------------
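For reference, each LR statistic above is twice the gap between the unrestricted log likelihood (-1138.8724) and the restricted one; a quick check in Stata, using the values printed above:

display 2*(-1138.8724 - (-1173.1427))   // theta = -1:  68.54
display 2*(-1138.8724 - (-1142.0913))   // theta =  0:   6.44
display 2*(-1138.8724 - (-1250.9672))   // theta =  1: 224.19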
I'd like to know whether this approach is right, and whether binary (0/1) dummy variables can be included in the transformation. I also have a few questions:
1. My lambda estimate is -0.22. Does that mean I should use the log form or the non-log form? Do the explanatory variables also need a Box-Cox transformation? If so, how do I transform them and how should the command be written? (A sketch of what I have in mind is below, after question 2.)
2. For the next step's regression, is it enough to write:
gen lnp=log(price)
reg lnp s1 s2 ......
(spelled out in full in the sketch below)?
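For question 1, here is a sketch of the syntax I have in mind, using boxcox's model() and notrans() options (the run above used the default model(lhsonly), which transforms only price; model(rhsonly), model(lambda), and model(theta) transform the regressors as well). Since the Box-Cox transform needs strictly positive values, the 0/1 dummies would presumably have to stay in notrans(); purely for illustration I treat dj and pj as the continuous regressors here, which may not match the actual data:

* transform both sides with one parameter; dummies left untransformed (illustrative split only)
boxcox price dj pj, model(lambda) notrans(s1 s2 dl wd1 wd2 bz yj ls lable wl tc yf1 yf2 sh1 sh2 dz) lrtest

And question 2 written out in full, with the same regressors as in the boxcox call above (yj listed once):

gen lnp = ln(price)
regress lnp s1 s2 dj dl wd1 wd2 bz yj ls lable wl tc yf1 yf2 sh1 sh2 pj dz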