Generic Conditions for Consistency and Asymptotic Normality

The ML estimator is a special case of an M-estimator. In Chapter 6, the generic conditions for consistency and asymptotic normality of M-estimators, which in most cases apply to ML estimators as well, were derived. The case (8.11) is one of the exceptions, though. In particular, if

Assumption 8.2:

$$\mathrm{plim}_{n\to\infty}\sup_{\theta\in\Theta}\left|\ln(\hat L_n(\theta)/\hat L_n(\theta_0))/n - E\left[\ln(\hat L_n(\theta)/\hat L_n(\theta_0))/n\right]\right| = 0$$

and

$$\lim_{n\to\infty}\sup_{\theta\in\Theta}\left|E\left[\ln(\hat L_n(\theta)/\hat L_n(\theta_0))/n\right] - \ell(\theta|\theta_0)\right| = 0,$$

where $\ell(\theta|\theta_0)$ is a continuous function in $\theta$ such that, for arbitrarily small $\delta > 0$,

$$\sup_{\theta\in\Theta:\,\|\theta-\theta_0\|\ge\delta}\ell(\theta|\theta_0) < 0,$$

then the ML estimator is consistent.

Theorem 8.3: Under Assumption 8.2, $\mathrm{plim}_{n\to\infty}\hat\theta = \theta_0$.

The conditions in Assumption 8.2 need to be verified on a case-by-case basis. In particular, the uniform convergence in probability condition has to be verified from the conditions of the uniform weak law of large numbers. Note that it follows from Theorem II.6 in Appendix II that the last condition in Assumption 8.2, that is, $\sup_{\theta\in\Theta:\,\|\theta-\theta_0\|\ge\delta}\ell(\theta|\theta_0) < 0$, holds if the parameter space $\Theta$ is compact, $\ell(\theta|\theta_0)$ is continuous on $\Theta$, and $\theta_0$ is unique. The latter follows from Theorem 8.1.
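To make Assumption 8.2 concrete, here is a small worked example of our own (not from the text): for an i.i.d. sample from the $N(\theta, 1)$ density with $\Theta \subset \mathbb{R}$ compact, the limit function can be computed in closed form:

```latex
\ell(\theta|\theta_0)
  = E\left[\ln\frac{f(Z_j|\theta)}{f(Z_j|\theta_0)}\right]
  = E\left[-\tfrac{1}{2}(Z_j-\theta)^2 + \tfrac{1}{2}(Z_j-\theta_0)^2\right]
  = -\tfrac{1}{2}(\theta-\theta_0)^2,
```

which is continuous, is zero only at $\theta = \theta_0$, and satisfies $\sup_{|\theta-\theta_0|\ge\delta}\ell(\theta|\theta_0) = -\delta^2/2 < 0$, exactly as the last condition of Assumption 8.2 requires.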

Some of the conditions for asymptotic normality of the ML estimator are already listed in Assumption 8.1 – in particular the convexity of the parameter space $\Theta$ and the condition that $\theta_0$ be an interior point of $\Theta$. The other (high-level) conditions are the following.

Assumption 8.3: For $i_1, i_2 = 1, 2, 3, \ldots, m$,

$$\mathrm{plim}_{n\to\infty}\sup_{\theta\in\Theta}\left|\frac{\partial^2\ln(\hat L_n(\theta))/n}{\partial\theta_{i_1}\,\partial\theta_{i_2}} - E\left[\frac{\partial^2\ln(\hat L_n(\theta))/n}{\partial\theta_{i_1}\,\partial\theta_{i_2}}\right]\right| = 0, \tag{8.31}$$

$$\lim_{n\to\infty}\sup_{\theta\in\Theta}\left|E\left[\frac{\partial^2\ln(\hat L_n(\theta))/n}{\partial\theta_{i_1}\,\partial\theta_{i_2}}\right] + \bar h_{i_1,i_2}(\theta)\right| = 0, \tag{8.32}$$

where $\bar h_{i_1,i_2}(\theta)$ is continuous in $\theta_0$. Moreover, the $m \times m$ matrix $\bar H$ with elements $\bar h_{i_1,i_2}(\theta_0)$ is nonsingular. Furthermore,

$$\frac{\partial\ln(\hat L_n(\theta_0))/\partial\theta_0^{\mathrm T}}{\sqrt n} \to_d N_m[0, \bar H]. \tag{8.33}$$

Note that the matrix $\bar H$ is just the limit of $\bar H_n/n$, where $\bar H_n$ is the Fisher information matrix (8.30). Condition (8.31) can be verified from the uniform weak law of large numbers. Condition (8.32) is a regularity condition that accommodates data heterogeneity. In quite a few cases we may take $\bar h_{i_1,i_2}(\theta) = -n^{-1}E[\partial^2\ln(\hat L_n(\theta))/(\partial\theta_{i_1}\,\partial\theta_{i_2})]$. Finally, condition (8.33) can be verified from the central limit theorem.

Theorem 8.4: Under Assumptions 8.1–8.3, $\sqrt n(\hat\theta - \theta_0) \to_d N_m[0, \bar H^{-1}]$.

Proof: It follows from the mean value theorem (see Appendix II) that for each $i \in \{1, \ldots, m\}$ there exists a $\hat\lambda_i \in [0, 1]$ such that

$$\frac{\partial\ln(\hat L_n(\theta))/\sqrt n}{\partial\theta_i}\bigg|_{\theta=\hat\theta} = \frac{\partial\ln(\hat L_n(\theta_0))/\sqrt n}{\partial\theta_i} + \left(\frac{\partial^2\ln(\hat L_n(\theta))/n}{\partial\theta_i\,\partial\theta^{\mathrm T}}\bigg|_{\theta=\theta_0+\hat\lambda_i(\hat\theta-\theta_0)}\right)\sqrt n(\hat\theta - \theta_0). \tag{8.34}$$

The first-order condition for (8.2) and the condition that $\theta_0$ be an interior point of $\Theta$ imply

$$\mathrm{plim}_{n\to\infty}\, n^{-1/2}\,\partial\ln(\hat L_n(\theta))/\partial\theta_i\big|_{\theta=\hat\theta} = 0. \tag{8.35}$$

Moreover, the convexity of $\Theta$ guarantees that the mean value $\theta_0 + \hat\lambda_i(\hat\theta - \theta_0)$ is contained in $\Theta$. It follows now from the consistency of $\hat\theta$ and the conditions (8.31) and (8.32) that the matrix $\hat H$ whose $i$th row is minus the $i$th row of the Hessian term in (8.34),

$$\hat H = \left(-\frac{\partial^2\ln(\hat L_n(\theta))/n}{\partial\theta_i\,\partial\theta^{\mathrm T}}\bigg|_{\theta=\theta_0+\hat\lambda_i(\hat\theta-\theta_0)}\right)_{i=1,\ldots,m}, \quad\text{satisfies}\quad \hat H \to_p \bar H. \tag{8.36}$$

The condition that $\bar H$ is nonsingular allows us to conclude from (8.36) and Slutsky’s theorem that

$$\mathrm{plim}_{n\to\infty}\hat H^{-1} = \bar H^{-1}; \tag{8.37}$$

hence, it follows from (8.34) and (8.35) that

$$\sqrt n(\hat\theta - \theta_0) = \hat H^{-1}\left(\partial\ln(\hat L_n(\theta_0))/\partial\theta_0^{\mathrm T}\right)/\sqrt n + o_p(1). \tag{8.38}$$

Theorem 8.4 follows now from condition (8.33) and the results (8.37) and (8.38). Q.E.D.

In the case of a random sample $Z_1, \ldots, Z_n$, the asymptotic normality condition (8.33) can easily be derived from the central limit theorem for i.i.d. random variables. For example, again let the $Z_j$'s be $k$-variate distributed with density $f(z|\theta_0)$. Then it follows from Theorem 8.2 that, under Assumption 8.1,

$$E\left[\partial\ln(f(Z_j|\theta_0))/\partial\theta_0^{\mathrm T}\right] = n^{-1}E\left[\partial\ln(\hat L_n(\theta_0))/\partial\theta_0^{\mathrm T}\right] = 0$$

and

$$\mathrm{Var}\left[\partial\ln(f(Z_j|\theta_0))/\partial\theta_0^{\mathrm T}\right] = n^{-1}\mathrm{Var}\left[\partial\ln(\hat L_n(\theta_0))/\partial\theta_0^{\mathrm T}\right] = \bar H,$$

and thus (8.33) straightforwardly follows from the central limit theorem for i.i.d. random vectors.
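As a small numerical illustration of Theorem 8.4 (our own sketch; the Exponential model, sample size, and replication count are assumptions, not from the text), consider an i.i.d. sample with density $f(z|\theta) = \theta e^{-\theta z}$. Then $\hat\theta = 1/\bar Z$ and $\bar H = 1/\theta_0^2$, so the theorem predicts $\sqrt n(\hat\theta - \theta_0) \to_d N(0, \theta_0^2)$:

```python
import numpy as np

# Monte Carlo sketch of Theorem 8.4 for an i.i.d. Exponential(theta0) sample.
# For f(z|theta) = theta*exp(-theta*z), the ML estimator is theta_hat = 1/mean(Z)
# and H_bar = 1/theta0**2, so sqrt(n)*(theta_hat - theta0) ~ N(0, theta0**2).

rng = np.random.default_rng(0)
theta0, n, reps = 2.0, 500, 2000

z = rng.exponential(scale=1.0 / theta0, size=(reps, n))  # E[Z] = 1/theta0
theta_hat = 1.0 / z.mean(axis=1)                         # ML estimator, one per replication
stats = np.sqrt(n) * (theta_hat - theta0)

print(stats.mean())       # should be close to 0
print(stats.var(ddof=1))  # should be close to H_bar^{-1} = theta0**2 = 4
```

The empirical variance of the normalized estimation errors approximates $\bar H^{-1}$, in line with the theorem.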

8.2.3. Asymptotic Normality in the Time Series Case

In the time series case (8.6) we have

$$\frac{\partial\ln(\hat L_n(\theta_0))/\partial\theta_0^{\mathrm T}}{\sqrt n} = \frac{1}{\sqrt n}\sum_{t=1}^n U_t, \tag{8.39}$$

where

$$U_1 = \partial\ln(f_1(Z_1|\theta_0))/\partial\theta_0^{\mathrm T}, \qquad U_t = \partial\ln(f_t(Z_t|Z_{t-1}, \ldots, Z_1, \theta_0))/\partial\theta_0^{\mathrm T} \ \text{ for } t \ge 2. \tag{8.40}$$

The process $U_t$ is a martingale difference process (see Chapter 7): Letting $\mathscr F_t = \sigma(Z_1, \ldots, Z_t)$ for $t \ge 1$ and designating $\mathscr F_0$ as the trivial $\sigma$-algebra $\{\Omega, \emptyset\}$, it is easy to verify that, for $t \ge 1$, $E[U_t|\mathscr F_{t-1}] = 0$ a.s. Therefore, condition (8.33) can in principle be derived from the conditions of the martingale difference central limit theorems (Theorems 7.10 and 7.11) in Chapter 7.
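The verification is the usual differentiation-under-the-integral argument (our one-line sketch, assuming the order of differentiation and integration may be interchanged): since the conditional density integrates to one for every $\theta$,

```latex
E[U_t|\mathscr F_{t-1}]
  = \int \frac{\partial \ln f_t(z|Z_{t-1},\ldots,Z_1,\theta_0)}{\partial\theta_0^{\mathrm T}}
         \, f_t(z|Z_{t-1},\ldots,Z_1,\theta_0)\, dz
  = \frac{\partial}{\partial\theta_0^{\mathrm T}} \int f_t(z|Z_{t-1},\ldots,Z_1,\theta_0)\, dz
  = \frac{\partial}{\partial\theta_0^{\mathrm T}}\, 1 = 0 \quad \text{a.s.}
```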

Note that, even if $Z_t$ is a strictly stationary process, the $U_t$'s may not be strictly stationary. In that case condition (8.33) can be proved by specializing Theorem 7.10 in Chapter 7.

An example of condition (8.33) following from Theorem 7.11 in Chapter 7 is the autoregressive (AR) model of order 1:

$$Z_t = \alpha + \beta Z_{t-1} + \varepsilon_t, \quad\text{where } \varepsilon_t \text{ is i.i.d. } N(0, \sigma^2) \text{ and } |\beta| < 1. \tag{8.41}$$

The condition $|\beta| < 1$ is necessary for strict stationarity of $Z_t$. Then, for $t \ge 2$, the conditional distribution of $Z_t$, given $\mathscr F_{t-1} = \sigma(Z_1, \ldots, Z_{t-1})$, is $N(\alpha + \beta Z_{t-1}, \sigma^2)$, and thus, with $\theta_0 = (\alpha, \beta, \sigma^2)^{\mathrm T}$, (8.40) becomes

$$U_t = \frac{\partial\left(-\tfrac{1}{2}(Z_t - \alpha - \beta Z_{t-1})^2/\sigma^2 - \tfrac{1}{2}\ln(\sigma^2) - \ln(\sqrt{2\pi})\right)}{\partial(\alpha, \beta, \sigma^2)^{\mathrm T}} = \frac{1}{\sigma^2}\begin{pmatrix} \varepsilon_t \\ \varepsilon_t Z_{t-1} \\ \tfrac{1}{2}(\varepsilon_t^2/\sigma^2 - 1) \end{pmatrix}. \tag{8.42}$$

Because the $\varepsilon_t$'s are i.i.d. $N(0, \sigma^2)$ and $\varepsilon_t$ and $Z_{t-1}$ are mutually independent, it follows that (8.42) is a martingale difference process not only with respect to $\mathscr F_t = \sigma(Z_1, \ldots, Z_t)$ but also with respect to $\mathscr F_t^{-\infty} = \sigma(\{Z_{t-j}\}_{j=0}^\infty)$; that is,

$$E[U_t|\mathscr F_{t-1}^{-\infty}] = 0 \text{ a.s.}$$

By backwards substitution of (8.41) it follows that $Z_t = \sum_{j=0}^\infty \beta^j(\alpha + \varepsilon_{t-j})$; hence, the marginal distribution of $Z_1$ is $N[\alpha/(1-\beta), \sigma^2/(1-\beta^2)]$. However, there is no need to derive $U_1$ in this case because this term is irrelevant for the asymptotic normality of (8.39). Therefore, the asymptotic normality of (8.39) in this case follows straightforwardly from the stationary martingale difference central limit theorem with asymptotic variance matrix

$$\bar H = \mathrm{Var}(U_t) = \frac{1}{\sigma^2}\begin{pmatrix} 1 & \dfrac{\alpha}{1-\beta} & 0 \\[1ex] \dfrac{\alpha}{1-\beta} & \dfrac{\alpha^2}{(1-\beta)^2} + \dfrac{\sigma^2}{1-\beta^2} & 0 \\[1ex] 0 & 0 & \dfrac{1}{2\sigma^2} \end{pmatrix}.$$
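The variance matrix $\bar H = \mathrm{Var}(U_t)$ can be checked by simulation. The following sketch (our own; the parameter values, sample size, and tolerance are assumptions, not from the text) simulates (8.41), forms the scores $U_t$ from (8.42) for $t \ge 2$, and compares their sample covariance matrix with the closed form:

```python
import numpy as np

# Simulation check of the AR(1) score variance: simulate (8.41), build the
# scores U_t from (8.42) for t >= 2, and compare their sample covariance
# with the closed-form H_bar = Var(U_t).

rng = np.random.default_rng(42)
alpha, beta, sigma2 = 1.0, 0.5, 1.0
n, burn = 200_000, 1_000

eps = rng.normal(0.0, np.sqrt(sigma2), size=n + burn)
z = np.empty(n + burn)
z[0] = alpha / (1.0 - beta)  # start at the stationary mean; burn-in removes any start-up effect
for t in range(1, n + burn):
    z[t] = alpha + beta * z[t - 1] + eps[t]
z, eps = z[burn:], eps[burn:]

# U_t = (1/sigma2) * (eps_t, eps_t * Z_{t-1}, (eps_t**2/sigma2 - 1)/2)^T
u = np.column_stack([eps[1:],
                     eps[1:] * z[:-1],
                     (eps[1:] ** 2 / sigma2 - 1.0) / 2.0]) / sigma2

mu = alpha / (1.0 - beta)  # stationary mean of Z_t
H_bar = np.array([[1.0, mu,                                   0.0],
                  [mu,  mu ** 2 + sigma2 / (1.0 - beta ** 2), 0.0],
                  [0.0, 0.0,                                  0.5 / sigma2]]) / sigma2

print(np.round(np.cov(u.T), 2))  # should be close to H_bar
```

With these parameter values the $(2,2)$ entry, for instance, is $\alpha^2/(1-\beta)^2 + \sigma^2/(1-\beta^2) \approx 5.33$, and the sample covariance reproduces it closely.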
