# Convergence properties and transformations

We are often interested in the convergence properties of transformed random vectors or variables. In particular, suppose $Z_n$ converges to $Z$ in a certain mode; given a function $g$, we may ask whether $g(Z_n)$ converges to $g(Z)$ in the same mode. The following theorem answers the question in the affirmative, provided $g$ is continuous (in the sense specified below). Part (a) of the theorem is commonly referred to as Slutsky's theorem.

Theorem 14. Let $Z_1, Z_2, \ldots$, and $Z$ be random vectors in $\mathbb{R}^k$. Furthermore, let $g : \mathbb{R}^k \to \mathbb{R}^s$ be a Borel-measurable function and assume that $g$ is continuous with $P_Z$-probability one (where $P_Z$ denotes the probability measure induced by $Z$ on $\mathbb{R}^k$). Then

(a) $Z_n \xrightarrow{p} Z$ implies $g(Z_n) \xrightarrow{p} g(Z)$,

(b) $Z_n \xrightarrow{a.s.} Z$ implies $g(Z_n) \xrightarrow{a.s.} g(Z)$,

(c) $Z_n \xrightarrow{d} Z$ implies $g(Z_n) \xrightarrow{d} g(Z)$.

In the special case where $Z = c$ is a constant (or a vector of constants), the continuity condition on $g$ in the above theorem only requires that $g$ is continuous at $c$.
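The constant-limit case can be illustrated numerically. The sketch below is a Monte Carlo illustration (not from the text): $Z_n$ is the sample mean of $n$ draws, which converges in probability to the constant $c = E[X]$, and $g(z) = e^z$ is continuous at $c$, so $g(Z_n)$ converges to $g(c)$. The exponential population, the value of $c$, and the replication counts are all illustrative assumptions.

```python
import numpy as np

# Sketch of the continuous mapping theorem with a constant limit:
# Z_n = sample mean of n Exponential(scale=c) draws -> c i.p., and
# g(z) = exp(z) is continuous at c, so g(Z_n) -> g(c) i.p.
rng = np.random.default_rng(0)
c = 2.0
g = np.exp
for n in [10, 1_000, 100_000]:
    # maximum deviation of g(Z_n) from g(c) over 200 independent replications
    reps = np.array([g(rng.exponential(scale=c, size=n).mean()) for _ in range(200)])
    max_err = np.abs(reps - g(c)).max()
    print(f"n={n:>6}: max |g(Z_n) - g(c)| over 200 replications = {max_err:.4f}")
```

The maximum deviation shrinks as $n$ grows, as convergence in probability of $g(Z_n)$ to $g(c)$ suggests.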

As special cases of Theorem 14 we have, for example, the following corollaries.

Corollary 2. Let $W_n$ and $V_n$ be sequences of $k$-dimensional random vectors. Suppose $W_n \to W$ and $V_n \to V$ i.p. [a.s.]. Then

$W_n \pm V_n \to W \pm V$ i.p. [a.s.],

$W_n'V_n \to W'V$ i.p. [a.s.].

In the case $k = 1$,

$W_n/V_n \to W/V$ i.p. [a.s.]

if $V \neq 0$ with probability one, where $W_n/V_n$ is set to an arbitrary value on the event $\{V_n = 0\}$.

Proof. The assumed convergence of $W_n$ and $V_n$ implies that $Z_n = (W_n', V_n')'$ converges to $Z = (W', V')'$ i.p. [a.s.] in view of Theorem 7. The corollary then follows from Theorem 14(a), (b), since the maps $g_1(w, v) = w + v$, $g_2(w, v) = w - v$, and $g_3(w, v) = w'v$ are continuous on all of $\mathbb{R}^{2k}$, and since the map $g_4(w, v) = w/v$ for $v \neq 0$ and $g_4(w, v) = c$ for $v = 0$ (with $c$ arbitrary) is continuous on $A = \mathbb{R} \times (\mathbb{R} \setminus \{0\})$, observing furthermore that $P_Z(A) = 1$ provided $V \neq 0$ with probability one. ■
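A quick numerical sketch of Corollary 2 (illustrative setup, not from the text): here $W_n$ and $V_n$ are running sample means, so by the law of large numbers they converge a.s. (hence i.p.) to the degenerate limits $W = 3$ and $V = 2$, and the sums, products, and ratios converge to the corresponding transforms of the limits.

```python
import numpy as np

# Corollary 2 in action: W_n -> 3 and V_n -> 2 i.p. (indeed a.s., by the LLN),
# hence W_n + V_n -> 5, W_n * V_n -> 6, and W_n / V_n -> 1.5 (V = 2 != 0 a.s.).
rng = np.random.default_rng(1)
n = 200_000
x = rng.normal(loc=3.0, scale=1.0, size=n)
y = rng.normal(loc=2.0, scale=1.0, size=n)
Wn = x.cumsum() / np.arange(1, n + 1)   # running sample means: W_n -> 3
Vn = y.cumsum() / np.arange(1, n + 1)   # V_n -> 2
print("W_n + V_n ->", Wn[-1] + Vn[-1])  # ~ 5
print("W_n * V_n ->", Wn[-1] * Vn[-1])  # ~ 6
print("W_n / V_n ->", Wn[-1] / Vn[-1])  # ~ 1.5
```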

The proof of the following corollary is completely analogous.

Corollary 3. Let $W_n$ and $V_n$ be sequences of random matrices of fixed dimension. Suppose $W_n \to W$ and $V_n \to V$ i.p. [a.s.]. Then

$W_n \pm V_n \to W \pm V$ i.p. [a.s.],

$W_nV_n \to WV$ i.p. [a.s.].

Furthermore,

$W_nV_n^{-1} \to WV^{-1}$ and $V_n^{-1}W_n \to V^{-1}W$ i.p. [a.s.]

if $V$ is nonsingular with probability one, where $W_nV_n^{-1}$ and $V_n^{-1}W_n$ are set to an arbitrary matrix of appropriate dimension on the event $\{V_n \text{ singular}\}$. (The matrices are assumed to be of conformable dimensions.)
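The inverse clause of Corollary 3 can be checked numerically. In this sketch (an illustrative setup, not from the text) $V_n$ and $W_n$ are a fixed nonsingular $V$ and a fixed $W$ perturbed by noise of order $n^{-1/2}$, so $W_nV_n^{-1} \to WV^{-1}$.

```python
import numpy as np

# Sketch of Corollary 3's inverse clause: V_n -> V i.p. with V nonsingular,
# so W_n V_n^{-1} -> W V^{-1}.  The particular matrices are arbitrary choices.
rng = np.random.default_rng(2)
V = np.array([[2.0, 1.0], [0.0, 1.0]])   # nonsingular limit
W = np.array([[1.0, 0.0], [1.0, 1.0]])
target = W @ np.linalg.inv(V)
for n in [10, 1_000, 100_000]:
    Vn = V + rng.normal(size=(2, 2)) / np.sqrt(n)   # noise shrinks at rate n^{-1/2}
    Wn = W + rng.normal(size=(2, 2)) / np.sqrt(n)
    err = np.abs(Wn @ np.linalg.inv(Vn) - target).max()
    print(f"n={n:>6}: max entrywise error of W_n V_n^(-1) = {err:.4f}")
```

Since matrix inversion is continuous at every nonsingular matrix, this is just Theorem 14(a), (b) applied to the map $(w, v) \mapsto wv^{-1}$.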

The following example shows that convergence in probability or almost sure convergence in Corollaries 2 and 3 cannot be replaced by convergence in distribution.

Example 5. Let $U \sim N(0, 1)$ and define $W_n = U$ and $V_n = (-1)^nU$. Then

$$W_n + V_n = \begin{cases} 2U \sim N(0, 4) & \text{if } n \text{ is even}, \\ 0 & \text{if } n \text{ is odd}. \end{cases}$$

Clearly, $W_n + V_n$ does not converge in distribution, although $W_n \xrightarrow{d} U$ and $V_n \xrightarrow{d} U$.

The reason behind this negative result is again the fact that convergence in distribution of the components of a random vector does in general not imply convergence in distribution of the entire random vector. Of course, if the entire random vector $Z_n = (W_n', V_n')'$ converges in distribution to $Z = (W', V')'$, then $W_n \pm V_n \xrightarrow{d} W \pm V$ and $W_n'V_n \xrightarrow{d} W'V$ as a consequence of Theorem 14; also, if $k = 1$ and $V \neq 0$ with probability one, then $W_n/V_n \xrightarrow{d} W/V$.
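Example 5 is easy to verify by simulation. In the sketch below (sample size is an illustrative choice), the variance of $W_n + V_n$ alternates between roughly $4$ (even $n$, where the sum is $2U$) and exactly $0$ (odd $n$, where the sum vanishes), so no limiting distribution exists.

```python
import numpy as np

# Monte Carlo check of Example 5: W_n = U and V_n = (-1)^n U each have the
# N(0,1) distribution for every n, yet W_n + V_n is 2U ~ N(0,4) for even n
# and identically 0 for odd n.
rng = np.random.default_rng(3)
U = rng.normal(size=100_000)
variances = {n: (U + (-1) ** n * U).var() for n in (10, 11)}
for n, v in variances.items():
    print(f"n={n}: sample var(W_n + V_n) = {v:.3f}")  # ~4.0 for n=10, 0.0 for n=11
```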

However, there is an important special case in which we can conclude that $Z_n = (W_n', V_n')' \xrightarrow{d} Z = (W', V')'$ from knowing that $W_n \xrightarrow{d} W$ and $V_n \xrightarrow{d} V$: this is the case where $V = c$ and $c$ is a constant vector.

Theorem 15. Let $W_n$ and $V_n$ be sequences of $k \times 1$ and $l \times 1$ random vectors, respectively. Let $W$ be a $k \times 1$ random vector and let $V = c$ be a constant vector in $\mathbb{R}^l$. Suppose $W_n \xrightarrow{d} W$ and $V_n \xrightarrow{d} c$ (or equivalently $V_n \xrightarrow{p} c$, in light of Theorem 10). Then $Z_n = (W_n', V_n')' \xrightarrow{d} Z = (W', c')'$.

Proof. Let $\phi_n(t)$ and $\phi(t)$ denote, respectively, the characteristic functions of $Z_n$ and $Z$. To show that $Z_n \xrightarrow{d} Z$ it suffices to show that $\phi_n(t) \to \phi(t)$ for all $t \in \mathbb{R}^{k+l}$, in light of the multivariate version of Theorem 8. Let $t = (s', u')'$ with $s \in \mathbb{R}^k$ and $u \in \mathbb{R}^l$ arbitrary. Observing that $|\exp(is'W_n)| = 1 = |\exp(iu'c)|$, we have

$$
\begin{aligned}
|\phi_n(t) - \phi(t)| &= \left|E\left(e^{is'W_n}e^{iu'V_n} - e^{is'W}e^{iu'c}\right)\right| \\
&\le E\left[\left|e^{is'W_n}\right|\left|e^{iu'V_n} - e^{iu'c}\right|\right] + \left|e^{iu'c}\right|\left|E\left(e^{is'W_n} - e^{is'W}\right)\right| \\
&\le E\left|e^{iu'V_n} - e^{iu'c}\right| + \left|E\left(e^{is'W_n} - e^{is'W}\right)\right| \\
&= E\left|e^{iu'V_n} - e^{iu'c}\right| + \left|\phi_n^W(s) - \phi^W(s)\right|,
\end{aligned}
$$

where $\phi_n^W(s)$ and $\phi^W(s)$ denote, respectively, the characteristic functions of $W_n$ and $W$. Since $V_n \xrightarrow{p} c$, it follows from Theorem 14 that $\exp(iu'V_n) - \exp(iu'c) \xrightarrow{p} 0$. Observing that $|\exp(iu'V_n) - \exp(iu'c)| \le 2$, it follows furthermore from Theorem 6 that $E|\exp(iu'V_n) - \exp(iu'c)| \to 0$. By assumption $W_n \xrightarrow{d} W$. It then follows, again from the multivariate version of Theorem 8, that $\phi_n^W(s) \to \phi^W(s)$. Thus both terms in the last line of the above display converge to zero, and hence $\phi_n(t) \to \phi(t)$. ■
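A simulation sketch of what Theorem 15 buys us (an illustrative setup, not from the text): with $W_n = \sqrt{n}(\bar X_n - \mu)/\sigma \xrightarrow{d} N(0,1)$ by the CLT and $V_n = \bar X_n \xrightarrow{p} \mu$ by the LLN, the joint convergence delivered by Theorem 15 lets us add them, so $W_n + V_n \xrightarrow{d} N(\mu, 1)$. The exponential population and all numerical values are assumptions of the sketch; at finite $n$ the variance slightly exceeds 1 because $V_n$ is not exactly constant.

```python
import numpy as np

# W_n + V_n -> N(mu, 1) in distribution, by Theorem 15 plus the continuous
# mapping theorem.  The sample mean of n Exponential(scale=mu) draws is
# Gamma(shape=n, scale=mu)/n, which lets us draw Xbar directly.
rng = np.random.default_rng(5)
mu, sd, n, reps = 2.0, 2.0, 50_000, 200_000
Xbar = rng.gamma(shape=n, scale=mu, size=reps) / n
Wn = np.sqrt(n) * (Xbar - mu) / sd   # -> N(0,1) in distribution (CLT)
Vn = Xbar                            # -> mu i.p. (LLN)
S = Wn + Vn
print(f"mean of W_n + V_n = {S.mean():.3f} (limit theory: {mu})")
print(f"var  of W_n + V_n = {S.var():.3f} (limit theory: 1)")
```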

Given Theorem 15 the following result follows immediately from Theorem 14.

Corollary 4. Let $W_n$ and $V_n$ be sequences of $k \times 1$ and $l \times 1$ random vectors, respectively. Let $W$ be a $k \times 1$ random vector and $c$ a constant vector in $\mathbb{R}^l$. Suppose $W_n \xrightarrow{d} W$ and $V_n \xrightarrow{d} c$ (or equivalently $V_n \xrightarrow{p} c$). Let $g : \mathbb{R}^k \times \mathbb{R}^l \to \mathbb{R}^s$ be a Borel-measurable function and assume that $g$ is continuous at every point of $A \times \{c\}$, where $A \subseteq \mathbb{R}^k$ satisfies $P(W \in A) = 1$. Then $g(W_n, V_n) \xrightarrow{d} g(W, c)$.

As a further corollary we have the following useful results.

Corollary 5. Let $W_n$ and $V_n$ be sequences of $k \times 1$ and $l \times 1$ random vectors, and let $A_n$ and $B_n$ be sequences of $l \times k$ and $k \times k$ random matrices, respectively. Furthermore, let $W$ be a $k \times 1$ random vector, let $c$ be an $l \times 1$ nonstochastic vector, and let $A$ and $B$ be nonstochastic $l \times k$ and $k \times k$ matrices.

(a) For $k = l$:

$W_n \xrightarrow{d} W$, $V_n \xrightarrow{p} c$ implies $W_n \pm V_n \xrightarrow{d} W \pm c$ and $W_n'V_n \xrightarrow{d} W'c$.

(If $c = 0$, then $W_n'V_n \xrightarrow{d} 0$ and hence also $W_n'V_n \xrightarrow{p} 0$.)

(b) For $k = l = 1$:

$W_n \xrightarrow{d} W$, $V_n \xrightarrow{p} c$ implies $W_n/V_n \xrightarrow{d} W/c$ if $c \neq 0$, and $V_n/W_n \xrightarrow{d} c/W$ if $P(W = 0) = 0$.

(c) $W_n \xrightarrow{d} W$, $V_n \xrightarrow{p} c$, $A_n \xrightarrow{p} A$ implies $A_nW_n + V_n \xrightarrow{d} AW + c$.

(d) $W_n \xrightarrow{d} W$, $B_n \xrightarrow{p} B$ implies $W_n'B_nW_n \xrightarrow{d} W'BW$.

Of course, if in the above corollary $W \sim N(\mu, \Sigma)$, then $AW + c \sim N(A\mu + c, A\Sigma A')$. If $W \sim N(0, I_k)$ and $B$ is symmetric and idempotent of rank $p$, then $W'BW \sim \chi^2(p)$.
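The closing chi-square fact can be checked by simulation. In this sketch (dimensions and the particular projection matrix are illustrative choices), $B$ projects onto the first $p$ coordinates, so it is symmetric and idempotent of rank $p$, and the quadratic forms $W'BW$ should have the mean $p$ and variance $2p$ of a $\chi^2(p)$ distribution.

```python
import numpy as np

# W ~ N(0, I_k), B symmetric idempotent of rank p  =>  W'BW ~ chi^2(p),
# which has mean p and variance 2p.
rng = np.random.default_rng(4)
k, p, reps = 5, 3, 200_000
B = np.diag([1.0] * p + [0.0] * (k - p))   # symmetric idempotent, rank p
W = rng.normal(size=(reps, k))             # reps independent N(0, I_k) draws
q = np.einsum("ij,jk,ik->i", W, B, W)      # quadratic form W'BW, one per draw
print(f"mean of W'BW = {q.mean():.3f} (theory: {p})")
print(f"var  of W'BW = {q.var():.3f} (theory: {2 * p})")
```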
