. logit opi zpbg size lev quick inv st roa big4 soe da a b c d e f g h j k y1 y2 if big4==1
note: b != 0 predicts success perfectly
b dropped and 17 obs not used
note: c != 0 predicts success perfectly
c dropped and 112 obs not used
note: e != 0 predicts success perfectly
e dropped and 7 obs not used
note: g != 0 predicts success perfectly
g dropped and 11 obs not used
note: h != 0 predicts success perfectly
h dropped and 8 obs not used
note: j != 0 predicts success perfectly
j dropped and 14 obs not used
note: k != 0 predicts success perfectly
k dropped and 13 obs not used
note: soe != 1 predicts failure perfectly
soe dropped and 1 obs not used
note: big4 dropped due to collinearity
note: a dropped due to collinearity
Iteration 0: log likelihood = -8.4376183
Iteration 1: log likelihood = -3.5089385
Iteration 2: log likelihood = -1.1440883
Iteration 3: log likelihood = -.49204537
Iteration 4: log likelihood = -.20385327
Iteration 5: log likelihood = -.07984997
Iteration 6: log likelihood = -.03022338
Iteration 7: log likelihood = -.01128073
Iteration 8: log likelihood = -.00418617
Iteration 9: log likelihood = -.00154877
Iteration 10: log likelihood = -.00057192
Iteration 11: log likelihood = -.0002109
Iteration 12: log likelihood = -.00007769
Iteration 13: log likelihood = -.00002859
Iteration 14: log likelihood = -.0000105
Iteration 15: log likelihood = -3.854e-06
Iteration 16: log likelihood = -1.408e-06
Iteration 17: log likelihood = -4.887e-07
Iteration 18: log likelihood = -1.724e-07
Iteration 19: log likelihood = -3.156e-08
Logistic regression                             Number of obs =         51
                                                LR chi2(12)   =      16.88
                                                Prob > chi2   =     0.1544
Log likelihood = 0                              Pseudo R2     =     1.0000
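
The notes above signal complete or quasi-complete separation in the big4==1 subsample: the indicators b, c, e, g, h, j, and k each predict success perfectly, soe predicts failure perfectly, and big4 and a are dropped for collinearity (big4 is constant once the sample is restricted to big4==1). With the remaining 12 regressors the model fits the 51 retained observations exactly (log likelihood of 0, Pseudo R2 = 1.0000), so the usual maximum-likelihood coefficients and standard errors are not interpretable. One common robustness check is a penalized (Firth) logit. The sketch below is illustrative only: it assumes the user-written firthlogit package is available from SSC, and it simply repeats the variable list from the command above minus big4, which is constant in this subsample; remaining collinear terms may still need to be dropped by hand.

* Sketch: Firth-penalized logit as a check on the separation flagged above
* (assumes the SSC package firthlogit is installed)
ssc install firthlogit, replace
firthlogit opi zpbg size lev quick inv st roa soe da a b c d e f g h j k y1 y2 ///
    if big4==1

Penalization keeps the separated indicators in the estimation rather than dropping them and their observations, at the cost of a different (penalized-likelihood) interpretation of the estimates.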