Estimation of $\sigma^2$
The variance of the regression disturbances $\sigma^2$ is unknown and has to be estimated. In fact, both the variance of $\widehat{\beta}_{OLS}$ and that of $\widehat{\alpha}_{OLS}$ depend upon $\sigma^2$, see (3.6) and Problem 5. An unbiased estimator for $\sigma^2$ is $s^2 = \sum_{i=1}^n e_i^2/(n-2)$. To prove this, we need the fact that
$$e_i = Y_i - \widehat{\alpha}_{OLS} - \widehat{\beta}_{OLS} X_i = y_i - \widehat{\beta}_{OLS} x_i = (\beta - \widehat{\beta}_{OLS})x_i + (u_i - \bar{u})$$
where $\bar{u} = \sum_{i=1}^n u_i/n$. The second equality substitutes $\widehat{\alpha}_{OLS} = \bar{Y} - \widehat{\beta}_{OLS}\bar{X}$ and the third equality substitutes $y_i = \beta x_i + (u_i - \bar{u})$. Hence,
$$E\left(\sum_{i=1}^n e_i^2\right) = \sum_{i=1}^n x_i^2\,\mathrm{var}(\widehat{\beta}_{OLS}) + (n-1)\sigma^2 - 2E\left(\sum_{i=1}^n x_i u_i\right)^2 \Big/ \sum_{i=1}^n x_i^2 = \sigma^2 + (n-1)\sigma^2 - 2\sigma^2 = (n-2)\sigma^2$$
where the first equality uses the facts that $E\left(\sum_{i=1}^n (u_i - \bar{u})^2\right) = (n-1)\sigma^2$, that $\widehat{\beta}_{OLS} - \beta = \sum_{i=1}^n x_i u_i \big/ \sum_{i=1}^n x_i^2$, and that $\sum_{i=1}^n x_i(u_i - \bar{u}) = \sum_{i=1}^n x_i u_i$ since $\sum_{i=1}^n x_i = 0$. The second equality uses the fact that $\mathrm{var}(\widehat{\beta}_{OLS}) = \sigma^2\big/\sum_{i=1}^n x_i^2$ and $E\left(\sum_{i=1}^n x_i u_i\right)^2 = \sigma^2 \sum_{i=1}^n x_i^2$.
Therefore, $E(s^2) = E\left(\sum_{i=1}^n e_i^2/(n-2)\right) = \sigma^2$.
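As a numerical illustration of this unbiasedness result, the following Monte Carlo sketch in Python (the parameter values $\alpha = 1$, $\beta = 2$, $\sigma^2 = 4$, and $n = 20$ are assumed for illustration, not taken from the text) repeatedly draws samples from the simple regression model and averages $s^2 = \sum_{i=1}^n e_i^2/(n-2)$ across replications:

```python
import numpy as np

# Monte Carlo check that s^2 = sum(e_i^2)/(n-2) is unbiased for sigma^2
# in the simple regression model Y_i = alpha + beta*X_i + u_i.
# All parameter values below are assumed for illustration.
rng = np.random.default_rng(0)
n, alpha, beta, sigma2 = 20, 1.0, 2.0, 4.0
X = np.linspace(0.0, 10.0, n)              # fixed regressor values
s2_draws = []
for _ in range(20000):
    u = rng.normal(0.0, np.sqrt(sigma2), n)
    Y = alpha + beta * X + u
    x, y = X - X.mean(), Y - Y.mean()      # deviations from sample means
    b_ols = (x @ y) / (x @ x)              # OLS slope estimate
    a_ols = Y.mean() - b_ols * X.mean()    # OLS intercept estimate
    e = Y - a_ols - b_ols * X              # OLS residuals
    s2_draws.append((e @ e) / (n - 2))     # divisor n-2, not n-1
print(round(float(np.mean(s2_draws)), 1))  # close to sigma2 = 4.0
```

Dividing by $n-1$ instead would produce an average visibly below $\sigma^2$ in the same experiment, which previews the degrees-of-freedom argument below.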
Intuitively, the estimator of $\sigma^2$ could be obtained from $\sum_{i=1}^n (u_i - \bar{u})^2/(n-1)$ if the true disturbances were known. Since the $u$'s are not known, consistent estimates of them are used. These are the $e_i$'s. Since $\sum_{i=1}^n e_i = 0$, our estimator of $\sigma^2$ becomes $\sum_{i=1}^n e_i^2/(n-1)$. Taking expectations, we find that the correct divisor ought to be $(n-2)$ and not $(n-1)$ for this estimator to be unbiased for $\sigma^2$. This is plausible, since we have estimated two parameters, $\alpha$ and $\beta$, in obtaining the $e_i$'s, and there are only $n-2$ independent pieces of information left in the data. To prove this fact, consider the OLS normal equations given in (3.2) and (3.3). These equations represent two relationships involving the $e_i$'s. Therefore, knowing $(n-2)$ of the $e_i$'s, we can deduce the remaining two $e_i$'s from (3.2) and (3.3).
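The two relationships referred to above can be checked numerically. The sketch below (with data-generating values assumed for illustration) fits OLS on one simulated sample and verifies that the residuals satisfy $\sum_{i=1}^n e_i = 0$ and $\sum_{i=1}^n X_i e_i = 0$, the two normal equations that leave only $n-2$ residuals free:

```python
import numpy as np

# Verify the two linear constraints the OLS normal equations impose on
# the residuals: sum(e_i) = 0 and sum(X_i * e_i) = 0.
# Data-generating values below are assumed for illustration.
rng = np.random.default_rng(1)
n = 10
X = rng.uniform(0.0, 5.0, n)
Y = 1.0 + 2.0 * X + rng.normal(0.0, 1.0, n)
x, y = X - X.mean(), Y - Y.mean()
b_ols = (x @ y) / (x @ x)
a_ols = Y.mean() - b_ols * X.mean()
e = Y - a_ols - b_ols * X
# Both sums are zero up to floating-point rounding.
print(round(e.sum(), 10), round((X * e).sum(), 10))
```

Given any $n-2$ of the residuals, these two equations form a linear system in the remaining two, which is exactly why only $n-2$ independent pieces of information remain.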