ARIMA Processes and the Phillips-Perron Test
The ADF test requires that the order $p$ of the AR model involved is finite and correctly specified, i.e., the specified order should not be smaller than the actual order. To analyze what happens if $p$ is misspecified, suppose that the actual data-generating process is given by (29.39) with $\alpha_0 = \alpha_2 = 0$ and $p > 1$, and that the unit root hypothesis is tested on the basis of the assumption that $p = 1$. Denoting $e_t = u_t/\sigma$, model (29.39) with $\alpha_0 = \alpha_2 = 0$ can be rewritten as

$$\Delta y_t = \gamma(L)e_t, \qquad (29.59)$$

where $\gamma(L) = \sigma\alpha(L)^{-1}$, with $\alpha(L) = 1 - \sum_{j=1}^{p-1}\alpha_j L^j$. This data-generating process can be nested in the auxiliary model

$$\Delta y_t = \alpha_0 + \alpha_1 y_{t-1} + u_t, \quad u_t = \gamma(L)e_t, \quad e_t \sim \text{iid } N(0, 1). \qquad (29.60)$$
We will now determine the limiting distribution of the OLS estimator $\hat\alpha_1$ and the corresponding t-value $\hat t_1$ of the parameter $\alpha_1$ in the regression (29.60), derived under the assumption that the $u_t$'s are independent, while in reality (29.59) holds.
Similarly to (29.41) we can write $\Delta y_t = \gamma(1)e_t + v_t - v_{t-1}$, where $v_t = [(\gamma(L) - \gamma(1))/(1 - L)]e_t$ is a stationary process. The latter follows from the fact that, by construction, the lag polynomial $\gamma(L) - \gamma(1)$ has a unit root and therefore contains a factor $1 - L$. Next, redefining $W_n(x)$ as

$$W_n(x) = \frac{1}{\sqrt{n}}\sum_{t=1}^{[nx]} e_t \ \text{ if } x \in [n^{-1}, 1], \qquad W_n(x) = 0 \ \text{ if } x \in [0, n^{-1}), \qquad (29.61)$$
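The decomposition $\Delta y_t = \gamma(1)e_t + v_t - v_{t-1}$ above can be checked numerically. The sketch below assumes, purely for illustration, a finite-order $\gamma(L)$ with hypothetical coefficients; it computes the coefficients of $\psi(L) = (\gamma(L) - \gamma(1))/(1 - L)$ and verifies that $\gamma(L)e_t$ coincides with $\gamma(1)e_t + v_t - v_{t-1}$, where $v_t = \psi(L)e_t$:

```python
import numpy as np

rng = np.random.default_rng(0)
gamma = np.array([1.0, 0.5, -0.3])   # hypothetical coefficients of gamma(L)
q = len(gamma) - 1
g1 = gamma.sum()                     # gamma(1)

# psi(L) = (gamma(L) - gamma(1))/(1 - L) has coefficients psi_j = -sum_{k>j} gamma_k
psi = np.array([-gamma[j + 1:].sum() for j in range(q)])

n = 200
e = rng.standard_normal(n + q)       # innovations, including presample values

def filt(coefs, eps, t):
    # value of coefs(L) applied to eps at time t
    return sum(c * eps[t - j] for j, c in enumerate(coefs))

for t in range(q, n + q):
    dy = filt(gamma, e, t)                         # Delta y_t = gamma(L) e_t
    v_t, v_tm1 = filt(psi, e, t), filt(psi, e, t - 1)
    assert abs(dy - (g1 * e[t] + v_t - v_tm1)) < 1e-12
print("decomposition verified")
```

The check works because $\gamma(L) - \gamma(1)$ vanishes at $L = 1$, so the long division by $1 - L$ terminates with no remainder.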
Then (29.33) becomes

$$n\hat\alpha_1 \Rightarrow \frac{\frac{1}{2}(W(1)^2 - \lambda) - W(1)\int_0^1 W(x)dx}{\int_0^1 W(x)^2dx - \left(\int_0^1 W(x)dx\right)^2} = \rho_1 + \frac{0.5(1 - \lambda)}{\int_0^1 W(x)^2dx - \left(\int_0^1 W(x)dx\right)^2} \qquad (29.68)$$

in distribution, and (29.34) becomes:

$$\hat t_1 \Rightarrow \frac{\frac{1}{2}(W(1)^2 - \lambda) - W(1)\int_0^1 W(x)dx}{\sqrt{\lambda}\sqrt{\int_0^1 W(x)^2dx - \left(\int_0^1 W(x)dx\right)^2}} = \frac{1}{\sqrt{\lambda}}\left[\tau_1 + \frac{0.5(1 - \lambda)}{\sqrt{\int_0^1 W(x)^2dx - \left(\int_0^1 W(x)dx\right)^2}}\right] \qquad (29.69)$$

in distribution, where $\rho_1$ and $\tau_1$ are the limiting random variables in (29.33) and (29.34), and $\lambda = \sigma_u^2/\sigma_L^2$. These results carry over straightforwardly to the case where the actual data-generating process is an ARIMA process $\alpha(L)\Delta y_t = \beta(L)e_t$, simply by redefining $\gamma(L) = \beta(L)/\alpha(L)$.
The parameter $\gamma(1)^2$ is known as the long-run variance of $u_t = \gamma(L)e_t$:

$$\sigma_L^2 = \lim_{n\to\infty}\frac{1}{n}E\left[\left(\sum_{t=1}^{n}u_t\right)^2\right] = \gamma(1)^2, \qquad (29.70)$$

which in general is different from the variance of $u_t$ itself:

$$\sigma_u^2 = E[u_t^2] = \sum_{j=0}^{\infty}\gamma_j^2, \qquad (29.71)$$

where the $\gamma_j$'s are the coefficients of $\gamma(L)$.
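The gap between the two variances is easy to see numerically. A minimal sketch, assuming for illustration an AR(1)-type lag polynomial $\gamma(L) = 1/(1 - aL)$, so that $\gamma_j = a^j$, $\sigma_L^2 = \gamma(1)^2 = 1/(1-a)^2$ and $\sigma_u^2 = \sum_j \gamma_j^2 = 1/(1-a^2)$:

```python
import numpy as np

a = 0.5                              # assumed AR(1) coefficient, |a| < 1
gamma_j = a ** np.arange(200)        # MA(infinity) coefficients of gamma(L), truncated

sigma_L2 = gamma_j.sum() ** 2        # long-run variance gamma(1)^2 -> 1/(1-a)^2 = 4
sigma_u2 = (gamma_j ** 2).sum()      # var(u_t) = sum_j gamma_j^2  -> 1/(1-a^2) = 4/3
print(sigma_L2, sigma_u2)
```

With positively autocorrelated $u_t$ the long-run variance exceeds the ordinary variance, which is exactly the discrepancy $\lambda \neq 1$ that distorts the Dickey-Fuller limits above.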
If we knew $\sigma_L^2$ and $\sigma_u^2$, and thus $\lambda = \sigma_u^2/\sigma_L^2$, then it would follow from (29.64), (29.65), and Lemma 1 that

$$n\hat\alpha_1 - \frac{0.5(\sigma_L^2 - \sigma_u^2)}{n^{-2}\sum_{t=1}^{n}(y_{t-1} - \bar y_{-1})^2} \Rightarrow \rho_1 \qquad (29.72)$$

in distribution, where $\bar y_{-1} = (1/n)\sum_{t=1}^{n}y_{t-1}$.
It is an easy exercise to verify that this result also holds if we replace $y_{t-1}$ by $y_t$ and $\bar y_{-1}$ by $\bar y = (1/n)\sum_{t=1}^{n}y_t$. Therefore, it follows from (29.68) and (29.72) that:
Theorem 3. (Phillips-Perron test 1) Under the unit root hypothesis, and given consistent estimators $\hat\sigma_L^2$ and $\hat\sigma_u^2$ of $\sigma_L^2$ and $\sigma_u^2$, respectively, we have

$$\hat Z_1 = n\hat\alpha_1 - \frac{0.5(\hat\sigma_L^2 - \hat\sigma_u^2)}{n^{-2}\sum_{t=1}^{n}(y_t - \bar y)^2} \Rightarrow \rho_1 \qquad (29.73)$$

in distribution.
This correction of (29.68) was proposed by Phillips and Perron (1988) for particular estimators $\hat\sigma_L^2$ and $\hat\sigma_u^2$, following the approach of Phillips (1987) for the case where the intercept $\alpha_0$ in (29.60) is assumed to be zero.
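As a concrete illustration, the corrected coefficient statistic of Theorem 3 can be computed along the following lines (a minimal sketch; the helper name `pp_z1` is made up, and the variance estimates are taken as given inputs):

```python
import numpy as np

def pp_z1(y, sigma_L2_hat, sigma_u2_hat):
    # Z1 = n * alpha1_hat - 0.5 * (sigma_L2_hat - sigma_u2_hat)
    #                       / (n^{-2} * sum_t (y_t - ybar)^2)
    dy, ylag = np.diff(y), y[:-1]
    n = len(dy)
    X = np.column_stack([np.ones(n), ylag])
    a0_hat, a1_hat = np.linalg.lstsq(X, dy, rcond=None)[0]   # OLS of (29.60)
    ydm = y[1:] - y[1:].mean()
    correction = 0.5 * (sigma_L2_hat - sigma_u2_hat) / ((ydm ** 2).sum() / n ** 2)
    return n * a1_hat - correction

# illustration: a pure random walk, where sigma_L^2 = sigma_u^2 = 1,
# so the correction vanishes and Z1 reduces to n * alpha1_hat
rng = np.random.default_rng(3)
y = np.cumsum(rng.standard_normal(500))
z1 = pp_z1(y, 1.0, 1.0)
print(z1)
```

Under the null, `z1` is a single draw from (approximately) the $\rho_1$ distribution, so it stays bounded as the sample grows.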
It is desirable to choose the estimators $\hat\sigma_L^2$ and $\hat\sigma_u^2$ such that, under the stationarity alternative, $\mathrm{plim}_{n\to\infty}\hat Z_1 = -\infty$. We show now that this is the case if we choose
$$\hat\sigma_u^2 = \frac{1}{n}\sum_{t=1}^{n}\hat u_t^2, \quad \text{where } \hat u_t = \Delta y_t - \hat\alpha_0 - \hat\alpha_1 y_{t-1}, \qquad (29.74)$$
and $\hat\sigma_L^2$ such that $\bar\sigma_L^2 = \mathrm{plim}_{n\to\infty}\hat\sigma_L^2 > 0$ under the alternative of stationarity.
First, it is easy to verify that $\hat\sigma_u^2$ is consistent under the null hypothesis, by verifying that (29.57) still holds. Under stationarity we have $\mathrm{plim}_{n\to\infty}\hat\alpha_1 = \mathrm{cov}(y_t, y_{t-1})/\mathrm{var}(y_t) - 1 = \alpha^*$, say, $\mathrm{plim}_{n\to\infty}\hat\alpha_0 = -\alpha^* E(y_t) = \alpha_0^*$, say, and $\mathrm{plim}_{n\to\infty}\hat\sigma_u^2 = (1 - (\alpha^* + 1)^2)\mathrm{var}(y_t) = \sigma_*^2$, say. Therefore,
$$\mathrm{plim}_{n\to\infty}\,\hat Z_1/n = -0.5\left(\alpha^{*2} + \bar\sigma_L^2/\mathrm{var}(y_t)\right) < 0. \qquad (29.75)$$
Phillips and Perron (1988) propose to estimate the long-run variance by the Newey-West (1987) estimator
$$\hat\sigma_L^2 = \hat\sigma_u^2 + 2\sum_{i=1}^{m}\left[1 - \frac{i}{m+1}\right]\frac{1}{n}\sum_{t=i+1}^{n}\hat u_t\hat u_{t-i}, \qquad (29.76)$$
where $\hat u_t$ is defined in (29.74), and $m$ converges to infinity with $n$ at rate $o(n^{1/4})$. Andrews (1991) has shown (and we will show it again along the lines in Bierens, 1994) that the rate $o(n^{1/4})$ can be relaxed to $o(n^{1/2})$. The weights $1 - i/(m + 1)$ guarantee that this estimator is always positive. The reason for the latter is the following. Let $\hat u_t^* = \hat u_t$ for $t = 1, \dots, n$, and $\hat u_t^* = 0$ for $t < 1$ and $t > n$. Then
$$\hat\sigma_L^2 = \frac{1}{m+1}\sum_{j=0}^{m}\frac{1}{n}\sum_{t=1}^{n+m}\hat u_{t-j}^{*2} + \frac{2}{m+1}\sum_{j=0}^{m-1}\sum_{i=1}^{m-j}\frac{1}{n}\sum_{t=1}^{n+m}\hat u_{t-j}^{*}\hat u_{t-j-i}^{*} = \frac{1}{n(m+1)}\sum_{t=1}^{n+m}\left(\sum_{j=0}^{m}\hat u_{t-j}^{*}\right)^2 \qquad (29.77)$$

is a scaled sum of squares and therefore nonnegative, and so is $\hat\sigma_L^2$.
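The estimator (29.76) and this positivity argument can be checked numerically. The sketch below (hypothetical helper names) computes the Bartlett-weighted sum directly and compares it with the equivalent scaled sum of squares of the zero-padded residual series:

```python
import numpy as np

def newey_west(u, m):
    # (1/n) sum u_t^2 + 2 * sum_{i=1}^m [1 - i/(m+1)] * (1/n) * sum_{t=i+1}^n u_t u_{t-i}
    n = len(u)
    s = (u ** 2).sum() / n
    for i in range(1, m + 1):
        s += 2.0 * (1.0 - i / (m + 1.0)) * (u[i:] * u[:-i]).sum() / n
    return s

rng = np.random.default_rng(4)
u, m = rng.standard_normal(100), 4
n = len(u)

# zero-pad: u*_t = u_t for t = 1..n, u*_t = 0 otherwise, then form
# (1/(n(m+1))) * sum_{t=1}^{n+m} (sum_{j=0}^m u*_{t-j})^2
u_star = np.concatenate([np.zeros(m), u, np.zeros(m)])
windows = np.array([u_star[t:t + m + 1].sum() for t in range(n + m)])
quad = (windows ** 2).sum() / (n * (m + 1))

print(newey_west(u, m), quad)   # the two values agree up to rounding
```

Since the second form is a sum of squares, the Bartlett weights indeed rule out a negative variance estimate, which flat (untapered) weights would not.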
Next, observe from (29.62) and (29.74) that

$$\hat u_t = u_t - \sqrt{n}\,\hat\alpha_1\gamma(1)W_n((t-1)/n) - \hat\alpha_1 v_{t-1} + \hat\alpha_1(v_0 - y_0) - \hat\alpha_0. \qquad (29.78)$$
Since

$$E\left|\frac{1}{n}\sum_{t=1+i}^{n}\hat u_t W_n((t-i)/n)\right| \le \sqrt{\frac{1}{n}\sum_{t=1+i}^{n}E(\hat u_t^2)}\sqrt{\frac{1}{n}\sum_{t=1+i}^{n}E\left(W_n((t-i)/n)^2\right)} = O(1),$$

it follows that $(1/n)\sum_{t=1+i}^{n}\hat u_t W_n((t-i)/n) = O_p(1)$. Similarly, $(1/n)\sum_{t=1+i}^{n}\hat u_{t-i} W_n(t/n) = O_p(1)$. Moreover, $\hat\alpha_1 = O_p(1/n)$, and similarly it can be shown that $\hat\alpha_0 = O_p(1/\sqrt{n})$. Therefore, it follows from (29.77) and (29.78) that

$$\hat\sigma_L^2 - \tilde\sigma_L^2 = O_p(m/\sqrt{n}) \rightarrow_p 0, \qquad (29.79)$$

where $\tilde\sigma_L^2$ is defined as $\hat\sigma_L^2$ in (29.77), with the $\hat u_t^*$'s replaced by $u_t^* = u_t$ for $t = 1, \dots, n$, $u_t^* = 0$ for $t < 1$ and $t > n$.
A similar result holds under the stationarity hypothesis. Moreover, substituting $u_t = \gamma(1)e_t + v_t - v_{t-1}$, and denoting $e_t^* = e_t$, $v_t^* = v_t$ for $t = 1, \dots, n$, $v_t^* = e_t^* = 0$ for $t < 1$ and $t > n$, it is easy to verify that under the unit root hypothesis

$$\tilde\sigma_L^2 = \gamma(1)^2\frac{1}{n}\sum_{t=1}^{n+m}\left(\frac{1}{\sqrt{m+1}}\sum_{j=0}^{m}e_{t-j}^*\right)^2 + 2\gamma(1)\frac{1}{n}\sum_{t=1}^{n+m}\left(\frac{1}{\sqrt{m+1}}\sum_{j=0}^{m}e_{t-j}^*\right)\frac{v_t^* - v_{t-m-1}^*}{\sqrt{m+1}} + \frac{1}{n}\sum_{t=1}^{n+m}\left(\frac{v_t^* - v_{t-m-1}^*}{\sqrt{m+1}}\right)^2. \qquad (29.80)$$
A similar result holds under the stationarity hypothesis. Thus:
Theorem 4. Let $m$ increase with $n$ to infinity at rate $o(n^{1/2})$. Then under both the unit root and stationarity hypotheses, $\mathrm{plim}_{n\to\infty}(\hat\sigma_L^2 - \tilde\sigma_L^2) = 0$. Moreover, under the unit root hypothesis, $\mathrm{plim}_{n\to\infty}\tilde\sigma_L^2 = \sigma_L^2$, and under the stationarity hypothesis, $\mathrm{plim}_{n\to\infty}\tilde\sigma_L^2 > 0$. Consequently, under stationarity, the Phillips-Perron test satisfies $\mathrm{plim}_{n\to\infty}\hat Z_1/n < 0$.
Finally, note that the advantage of the Phillips-Perron test is that there is no need to specify the ARIMA process under the null hypothesis: it is in essence a nonparametric test. Of course, we still have to specify the Newey-West truncation lag $m$ as a function of $n$, but as long as $m = o(\sqrt{n})$, this specification is asymptotically not critical.
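Putting the pieces together, a minimal end-to-end sketch of the test (the sample size, AR coefficient, and truncation-lag rule below are illustrative choices, with $m$ growing slower than $\sqrt{n}$): for a pure random walk the statistic should remain bounded, while for a stationary AR(1) series the normalized statistic should settle at a negative number, as in (29.75):

```python
import numpy as np

def pp_coefficient_stat(y, m):
    # OLS of (29.60), sigma_u2_hat from (29.74), Newey-West sigma_L2_hat
    # from (29.76), and the corrected statistic Z1 of Theorem 3
    dy, ylag = np.diff(y), y[:-1]
    n = len(dy)
    X = np.column_stack([np.ones(n), ylag])
    a0, a1 = np.linalg.lstsq(X, dy, rcond=None)[0]
    u = dy - a0 - a1 * ylag
    s_u2 = (u ** 2).sum() / n
    s_L2 = s_u2 + sum(2.0 * (1.0 - i / (m + 1.0)) * (u[i:] * u[:-i]).sum() / n
                      for i in range(1, m + 1))
    ydm = y[1:] - y[1:].mean()
    return n * a1 - 0.5 * (s_L2 - s_u2) / ((ydm ** 2).sum() / n ** 2)

rng = np.random.default_rng(5)
n = 2000
m = int(5 * (n / 100) ** 0.25)        # truncation lag growing slower than sqrt(n)

e = rng.standard_normal(n + 1)
walk = np.cumsum(e)                   # unit root case
ar1 = np.empty(n + 1)                 # stationary AR(1) case
ar1[0] = 0.0
for t in range(1, n + 1):
    ar1[t] = 0.7 * ar1[t - 1] + e[t]

z_walk = pp_coefficient_stat(walk, m)       # bounded under the unit root
z_ar_over_n = pp_coefficient_stat(ar1, m) / n   # negative under stationarity
print(z_walk, z_ar_over_n)
```

For the AR(1) series with coefficient 0.7, (29.75) predicts a limit of roughly $-0.5(0.3^2 + (1 - 0.7^2)^{-1} \cdot (1 - 0.7^2)) \approx -0.3$ for the normalized statistic, so the test rejects with probability approaching one.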