partial regression coefficients,
and their meaning will be explained shortly.
We continue to operate within the framework of the classical linear regression model
(CLRM) first introduced in Chapter 3. As a reminder, we assume the following:
² This assumption is automatically fulfilled if X2 and X3 are nonstochastic and Eq. (7.1.4) holds.
1. Linear regression model, or linear in the parameters. (7.1.2)
2. Fixed X values or X values independent of the error term.² Here, this means we require zero covariance between u_i and each X variable:

   cov(u_i, X2i) = cov(u_i, X3i) = 0    (7.1.3)
3. Zero mean value of disturbance u_i:

   E(u_i | X2i, X3i) = 0    for each i    (7.1.4)
4. Homoscedasticity, or constant variance of u_i:

   var(u_i) = σ²    (7.1.5)
5. No autocorrelation, or serial correlation, between the disturbances:

   cov(u_i, u_j) = 0,    i ≠ j    (7.1.6)
6. The number of observations n must be greater than the number of parameters to be estimated, which is 3 in our current case. (7.1.7)
7. There must be variation in the values of the X variables. (7.1.8)
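Assumptions 6 and 7 (and assumption 8 below) are mechanical conditions that can be checked directly on the data. As an illustration only, the following sketch fits the three-variable model Y = β1 + β2·X2 + β3·X3 + u by ordinary least squares on made-up numbers, verifying along the way that n exceeds the number of parameters and that the design matrix has full column rank (no exact collinearity):

```python
import numpy as np

# Hypothetical data, invented purely for illustration: n = 6 observations.
X2 = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
X3 = np.array([2.0, 1.0, 4.0, 3.0, 6.0, 5.0])
Y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])

# Design matrix: a column of ones for the intercept beta_1, then X2 and X3.
X = np.column_stack([np.ones_like(X2), X2, X3])

# Assumption 6: n must be greater than the number of parameters (3 here).
assert X.shape[0] > X.shape[1]

# No exact collinearity: X must have full column rank so that X'X is
# invertible and the OLS estimator is uniquely defined.
assert np.linalg.matrix_rank(X) == X.shape[1]

# OLS estimate beta_hat = (X'X)^(-1) X'Y, computed via least squares
# rather than an explicit matrix inverse for numerical stability.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(beta_hat)  # estimates of beta_1, beta_2, beta_3
```

If X2 and X3 were exact linear functions of one another, the rank check would fail and the normal equations would have no unique solution, which is precisely what assumption 8 rules out.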
We will also address two other requirements.
8. No exact collinearity between the X variables.
No