a. "Since the zero-order correlations are very high, there must be serious multicollinearity." Comment.
b. Would you drop variables $X_i^2$ and $X_i^3$ from the model?
c. If you drop them, what will happen to the value of the coefficient of $X_i$?
10.11. Stepwise regression. In deciding on the "best" set of explanatory variables for a regression model, researchers often follow the method of stepwise regression. In this method one proceeds either by introducing the $X$ variables one at a time (stepwise forward regression) or by including all the possible $X$ variables in one multiple regression and rejecting them one at a time (stepwise backward regression). The decision to add or drop a variable is usually made on the basis of the contribution of that variable to the ESS, as judged by the $F$ test. Knowing what you do now about multicollinearity, would you recommend either procedure? Why or why not?
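To make the forward procedure concrete, here is a minimal sketch in Python (numpy only), assuming the usual partial-$F$ entry rule; the helper names and the entry threshold `f_in` are illustrative assumptions, not part of the exercise.

```python
import numpy as np

def rss(y, X):
    """Residual sum of squares from an OLS fit of y on X plus an intercept."""
    Z = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta
    return e @ e

def forward_stepwise(y, X, f_in=4.0):
    """Stepwise forward regression: at each step, add the candidate column of X
    whose partial F statistic (incremental ESS over the residual mean square of
    the enlarged model) is largest; stop when no candidate's F exceeds f_in."""
    n, k = X.shape
    selected, remaining = [], list(range(k))
    while remaining:
        rss_old = rss(y, X[:, selected])          # with no columns yet, this is the TSS
        best_f, best_j = -np.inf, None
        for j in remaining:
            rss_new = rss(y, X[:, selected + [j]])
            df_resid = n - (len(selected) + 2)    # intercept + current slopes + new slope
            f_stat = (rss_old - rss_new) / (rss_new / df_resid)
            if f_stat > best_f:
                best_f, best_j = f_stat, j
        if best_f < f_in:
            break                                 # no remaining variable earns entry
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

Under high multicollinearity the incremental ESS credited to a variable depends heavily on which variables are already included, so the order of entry, and even the final selected set, can be unstable; this instability is the weakness the question asks you to weigh.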
*
10.12. State with reason whether the following statements are true, false, or uncertain:
a. Despite perfect multicollinearity, OLS estimators are BLUE.
b. In cases of high multicollinearity, it is not possible to assess the individual significance of one or more partial regression coefficients.
c. If an auxiliary regression shows that a particular $R_i^2$ is high, there is definite evidence of high collinearity.
d. High pair-wise correlations do not suggest that there is high multicollinearity.
e. Multicollinearity is harmless if the objective of the analysis is prediction only.
f. Ceteris paribus, the higher the VIF is, the larger the variances of OLS estimators.
g. The tolerance (TOL) is a better measure of multicollinearity than the VIF. (A numerical sketch of both measures follows this list.)
h. You will not obtain a high $R^2$ value in a multiple regression if all the partial slope coefficients are individually statistically insignificant on the basis of the usual $t$ test.
i. In the regression of $Y$ on $X_2$ and $X_3$, suppose there is little variability in the values of $X_3$. This would increase $\operatorname{var}(\hat{\beta}_3)$. In the extreme, if all $X_3$ are identical, $\operatorname{var}(\hat{\beta}_3)$ is infinite.
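As background for items (c), (f), and (g): the auxiliary regression of regressor $X_j$ on the remaining regressors yields $R_j^2$, from which $\text{VIF}_j = 1/(1 - R_j^2)$ and $\text{TOL}_j = 1/\text{VIF}_j = 1 - R_j^2$. A minimal numpy sketch, with function names of my own choosing:

```python
import numpy as np

def aux_r2(X, j):
    """R-squared of the auxiliary regression of column j of X on the
    remaining columns (with an intercept)."""
    y = X[:, j]
    others = np.delete(X, j, axis=1)
    Z = np.column_stack([np.ones(len(y)), others])
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ beta
    return 1.0 - (e @ e) / np.sum((y - y.mean()) ** 2)

def vif_tol(X):
    """Variance-inflation factor and tolerance for every regressor:
    VIF_j = 1/(1 - R2_j) and TOL_j = 1/VIF_j."""
    r2 = np.array([aux_r2(X, j) for j in range(X.shape[1])])
    vif = 1.0 / (1.0 - r2)
    return vif, 1.0 / vif
```

Since TOL is just the reciprocal of VIF, the two measures are monotone transformations of each other and carry exactly the same information, a point worth bearing in mind for statement (g).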
10.13. a. Show that if $r_{1i} = 0$ for $i = 2, 3, \ldots, k$, then $R_{1.23 \ldots k} = 0$.
b. What is the importance of this finding for the regression of variable $X_1 (= Y)$ on $X_2, X_3, \ldots, X_k$?
10.14. Suppose all the zero-order correlation coefficients of $X_1 (= Y), X_2, \ldots, X_k$ are equal to $r$.
a. What is the value of $R^2_{1.23 \ldots k}$?
b. What are the values of the first-order correlation coefficients?
**
10.15. In matrix notation it can be shown (see