Asymptotic Normality of the Least Squares Estimator
As was noted earlier, the first step in obtaining FGLS is calculating LS; therefore the properties of FGLS depend on those of LS. The least squares estimator is consistent in this model if assumption B of Theorem 6.1.2 is satisfied, because in that case assumption A is satisfied by Theorem 5.2.3. In fact, Theorem 5.2.3 states that assumption A is satisfied even when $u$ follows a more general process than AR(1). So in this subsection we shall prove only the asymptotic normality of LS, and we shall do so for a process of $u$ more general than AR(1) but only for the case of one regressor (that is, $K = 1$) in Model 6. (Anderson, 1971, p. 585, has given the proof in the case of $K$ regressors. He assumed an even more general process for $u$ than the one we assume in the following proof.)
Theorem 6.3.1. Assume $K = 1$ in Model 6. Because $X$ is a $T$-vector in this case, we shall denote it by $x$ and its $t$th element by $x_t$. Assume
(A) $\lim_{T \to \infty} T^{-1} x'x = c_1 \neq 0$.
(B) $u_t = \sum_{j=0}^{\infty} \phi_j \varepsilon_{t-j}$ with $\sum_{j=0}^{\infty} |\phi_j| < \infty$, where $\{\varepsilon_t\}$ are i.i.d. with $E\varepsilon_t = 0$ and $V\varepsilon_t = \sigma^2$.
Then $\sqrt{T}\,(\hat{\beta} - \beta) \to N(0,\; c_1^{-2} c_2)$, where $c_2 = \lim_{T \to \infty} T^{-1} x' E uu' x$.
Proof. We need only prove $T^{-1/2} x'u \to N(0, c_2)$, because then the theorem follows from assumption A and Theorem 3.2.7 (iii) (Slutsky). We can write, for any fixed $N$,
$$T^{-1/2} x'u = T^{-1/2} \sum_{t=1}^{T} x_t \sum_{j=0}^{N} \phi_j \varepsilon_{t-j} + T^{-1/2} \sum_{t=1}^{T} x_t \sum_{j=N+1}^{\infty} \phi_j \varepsilon_{t-j} \equiv A_1 + A_2.$$
But $V(A_2) \le T^{-1} x'x \, \sigma^2 \bigl( \sum_{j=N+1}^{\infty} |\phi_j| \bigr)^2$. Therefore $A_2$ can be ignored if one takes $N$ large enough. Changing the order of summation in $A_1$, we have
$$A_1 = T^{-1/2} \sum_{t=1}^{T} \Bigl( \sum_{j=0}^{N} \phi_j x_{t+j} \Bigr) \varepsilon_t + A_{12} + A_{13},$$
where $A_{12}$ and $A_{13}$ collect the finitely many end terms arising at the two boundaries of the sample.
But $V(A_{12}) \le T^{-1} \sigma^2 N M^2 \bigl( \sum_{j=1}^{N} |\phi_j| \bigr)^2$, which goes to 0 as $T \to \infty$ for a fixed $N$. The same is true for $A_{13}$. The theorem follows by noting that $\sum_{j=0}^{N} \phi_j x_{t+j}$ satisfies the condition for $x_t$ in Theorem 3.5.3.
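The limit distribution in Theorem 6.3.1 can be checked by simulation. The sketch below (not from the text) takes the simplest special case: $x_t = 1$ for all $t$, so $c_1 = 1$ and LS is the sample mean, with AR(1) errors $u_t = \rho u_{t-1} + \varepsilon_t$, that is, $\phi_j = \rho^j$, for which $c_2 = \sigma^2 (\sum_j \phi_j)^2 = \sigma^2/(1-\rho)^2$. The parameter values, sample size, and replication count are illustrative assumptions.

```python
import numpy as np

# Monte Carlo sketch of sqrt(T)*(beta_hat - beta) -> N(0, c1^{-2} c2).
# Assumed special case: x_t = 1 (so c1 = 1 and LS = sample mean) and
# AR(1) errors with phi_j = rho**j, so c2 = sigma^2 / (1 - rho)**2.
rng = np.random.default_rng(0)
rho, sigma, beta = 0.5, 1.0, 2.0
T, R, burn = 1000, 3000, 200       # sample size, replications, burn-in

eps = np.asarray(rng.normal(0.0, sigma, (R, T + burn)))
u = np.zeros((R, T + burn))
for t in range(1, T + burn):       # u_t = rho*u_{t-1} + eps_t, all reps at once
    u[:, t] = rho * u[:, t - 1] + eps[:, t]
u = u[:, burn:]                    # discard burn-in so u is near-stationary

y = beta + u                       # y_t = beta*x_t + u_t with x_t = 1
beta_hat = y.mean(axis=1)          # LS estimator when x_t = 1
stats = np.sqrt(T) * (beta_hat - beta)

print("empirical variance :", stats.var())
print("theoretical c1^-2 c2:", sigma**2 / (1 - rho) ** 2)
```

With $\rho = 0.5$ and $\sigma^2 = 1$ the theoretical asymptotic variance is $4$, and the empirical variance of the simulated statistics should be close to it; note that the naive i.i.d. variance $\sigma^2/c_1 = 1$ would be badly wrong, which is why $c_2$ involves $Euu'$ rather than $\sigma^2 I$.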