# Useful Theorems in Matrix Analysis

The theorems listed in this appendix are the ones especially useful in econometrics. All matrices are assumed to be real. Proofs for many of these theorems can be found in Bellman (1970).

1. For any square matrix A with distinct characteristic roots, there exists a nonsingular matrix P such that PAP⁻¹ = Λ, where Λ is a diagonal matrix with the characteristic roots of A in the diagonal. If the characteristic roots are not distinct, Λ takes the Jordan canonical form (see Bellman, p. 198).

2. The determinant of a matrix is the product of its characteristic roots.
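As a quick numerical illustration of results 1 and 2 (a sketch using NumPy; the matrix below is made up), note that `np.linalg.eig` returns the roots w and a matrix V of characteristic vectors with AV = V diag(w), so the theorem's P corresponds to V⁻¹:

```python
import numpy as np

# A (hypothetical) 2 x 2 matrix with distinct characteristic roots 4 and 1.
A = np.array([[3.0, 1.0],
              [2.0, 2.0]])
w, V = np.linalg.eig(A)              # roots w and characteristic vectors V
P = np.linalg.inv(V)                 # the theorem's P
Lam = P @ A @ np.linalg.inv(P)       # PAP^{-1}, should equal diag(w)
```

The determinant of A then matches the product of the roots in w, as result 2 states.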

3. For any symmetric matrix A, there exists an orthogonal matrix H such that H'H = I and H'AH = Λ, where Λ is a diagonal matrix with the characteristic roots (which are real) of A in the diagonal. The ith column of H is called the characteristic vector of A corresponding to the characteristic root of A that is the ith diagonal element of Λ.

4. For a symmetric matrix A, the following statements are equivalent:

(i) A is positive definite. (Write A > 0.)

(ii) x’Ax is positive for any nonzero vector x.

(iii) Principal minors of A are all positive.

(iv) Characteristic roots of A are all positive.

The above remains true if we change the word positive to nonnegative (in which case we write A ≥ 0).
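The equivalences in result 4 can be checked numerically (a sketch using NumPy; the matrix and test vector are made up, and in the positive definite case the leading principal minors suffice):

```python
import numpy as np

# A (hypothetical) symmetric positive definite matrix.
A = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])
roots = np.linalg.eigvalsh(A)                            # (iv): all positive
minors = [np.linalg.det(A[:k, :k]) for k in (1, 2, 3)]   # (iii): leading principal minors
x = np.array([1.0, -2.0, 0.5])                           # an arbitrary nonzero vector
quad = x @ A @ x                                         # (ii): the quadratic form x'Ax
```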

5. For any matrices A and B, the nonzero characteristic roots of AB and BA are the same, whenever both AB and BA are defined.

6. tr AB = tr BA.

7. For any square matrix A, tr A is equal to the sum of its characteristic roots.
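Results 5 through 7 can be illustrated with rectangular matrices (a sketch using NumPy; the random matrices are made up): AB is 4 × 4 and BA is 3 × 3, yet they share their nonzero characteristic roots and their trace.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))
B = rng.standard_normal((3, 4))

tr_ab = np.trace(A @ B)                  # result 6: tr AB = tr BA
tr_ba = np.trace(B @ A)

roots_ab = np.linalg.eigvals(A @ B)      # four roots, one of them ~ 0
roots_ba = np.linalg.eigvals(B @ A)      # three roots

sum_roots = roots_ab.sum().real          # result 7: equals tr AB
```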

8. Let A, В be symmetric matrices of the same size. A necessary and

sufficient condition that there exists an orthogonal matrix H such that both H’AH and H’BH are diagonal is that AB = BA.

9. Any nonnegative (semipositive) definite matrix A can be written as A = TT’, where T is a lower triangular matrix with nonnegative diagonal elements.
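For a positive definite A, the factor T of result 9 is exactly what `np.linalg.cholesky` computes (a sketch; for a singular nonnegative definite matrix the factorization still exists, but `cholesky` itself will fail):

```python
import numpy as np

# A (hypothetical) positive definite matrix.
A = np.array([[4.0, 2.0],
              [2.0, 3.0]])
T = np.linalg.cholesky(A)    # lower triangular, positive diagonal, A = TT'
```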

10. Let λ₁, λ₂, . . . , λₙ be the characteristic roots of a symmetric matrix A in descending order (λ₁ being the largest). Then

λ₁ = max x'Ax/x'x and λₙ = min x'Ax/x'x,

where the maximum and the minimum are taken over all nonzero vectors x.

11. Let A and B be symmetric matrices (n × n) with B nonnegative definite. Then μᵢ(A + B) ≥ λᵢ(A), i = 1, 2, . . . , n, where the λ's and μ's are the characteristic roots of A and A + B, respectively, in descending order. The strict inequality holds if B is positive definite.
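The monotonicity of the characteristic roots under addition of a nonnegative definite matrix is easy to see numerically (a sketch using NumPy; the random 3 × 3 matrices are made up):

```python
import numpy as np

rng = np.random.default_rng(1)
G = rng.standard_normal((3, 3))
A = G + G.T                                      # symmetric
C = rng.standard_normal((3, 2))
B = C @ C.T                                      # nonnegative definite, rank 2
lam = np.sort(np.linalg.eigvalsh(A))[::-1]       # roots of A, descending
mu = np.sort(np.linalg.eigvalsh(A + B))[::-1]    # roots of A + B, descending
```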

12. Let a square matrix be partitioned as [A B; C D] with A and D square. Then

det [A B; C D] = |D| · |A − BD⁻¹C| if |D| ≠ 0.

13. If A, D, E, and F below are nonsingular,

[A B; C D]⁻¹ = [E⁻¹ −E⁻¹BD⁻¹; −F⁻¹CA⁻¹ F⁻¹],

where E = A − BD⁻¹C, F = D − CA⁻¹B, E⁻¹ = A⁻¹ + A⁻¹BF⁻¹CA⁻¹, and F⁻¹ = D⁻¹ + D⁻¹CE⁻¹BD⁻¹.
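The partitioned determinant and inverse formulas can be verified on a small example (a sketch using NumPy; the 2 × 2 blocks are made up):

```python
import numpy as np

# (Hypothetical) 2 x 2 blocks of a nonsingular 4 x 4 matrix.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 0.0], [2.0, 1.0]])
C = np.array([[0.0, 1.0], [1.0, 0.0]])
D = np.array([[4.0, 1.0], [1.0, 3.0]])
M = np.block([[A, B], [C, D]])

Ainv, Dinv = np.linalg.inv(A), np.linalg.inv(D)
E = A - B @ Dinv @ C                     # E = A - B D^{-1} C
F = D - C @ Ainv @ B                     # F = D - C A^{-1} B
Einv, Finv = np.linalg.inv(E), np.linalg.inv(F)

det_formula = np.linalg.det(D) * np.linalg.det(E)
Minv_formula = np.block([[Einv, -Einv @ B @ Dinv],
                         [-Finv @ C @ Ainv, Finv]])
```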

14. Let A be an n × r matrix of rank r. A matrix of the form P = A(A'A)⁻¹A' is called a projection matrix and is of special importance in statistics.

(i) P = P' = P². (Hence, P is symmetric and idempotent.)

(ii) rank (P) = r.

(iii) Characteristic roots of P consist of r ones and n − r zeros.

(iv) If x = Ac for some vector c, then Px = x. (Hence the word projection.)

(v) M = I − A(A'A)⁻¹A' is also idempotent, with rank n − r, its characteristic roots consisting of n − r ones and r zeros, and if x = Ac, then Mx = 0.

(vi) P can be written as G'G, where GG' = I, or as v₁v₁' + v₂v₂' + . . . + vᵣvᵣ', where vᵢ is a vector and r = rank (P).
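The projection properties (i)–(v) can all be checked at once (a sketch using NumPy; the 5 × 2 matrix of rank 2 is made up):

```python
import numpy as np

# A (hypothetical) n x r matrix of full column rank, n = 5, r = 2.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
P = A @ np.linalg.inv(A.T @ A) @ A.T     # projection onto the column space of A
M = np.eye(5) - P                        # projection onto its orthogonal complement
x = A @ np.array([2.0, -1.0])            # x = Ac lies in the column space
roots = np.sort(np.linalg.eigvalsh(P))[::-1]
```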

15. Let A be an n × r matrix of rank r. Partition A as A = (A₁, A₂) such that A₁ is n × r₁ and A₂ is n × r₂. If we define Ā₂ = [I − A₁(A₁'A₁)⁻¹A₁']A₂, then we have A(A'A)⁻¹A' = A₁(A₁'A₁)⁻¹A₁' + Ā₂(Ā₂'Ā₂)⁻¹Ā₂'.

16. If A is positive definite and B is symmetric, there exists a nonsingular matrix C such that C'AC = I, C'BC = D, where D is diagonal and the diagonal elements of D are the roots of |B − λA| = 0.

17. Let A and B be symmetric nonsingular matrices of the same size. Then A ≥ B ≥ 0 implies B⁻¹ ≥ A⁻¹.

18. Let A be any nonsingular matrix. The characteristic roots of A⁻¹ are the reciprocals of the characteristic roots of A.

19. Let A and B be nonsingular matrices of the same size such that A + B and A⁻¹ + B⁻¹ are nonsingular. Then

(i) (A + B)⁻¹ = A⁻¹(A⁻¹ + B⁻¹)⁻¹B⁻¹

(ii) A⁻¹ − (A + B)⁻¹ = A⁻¹(A⁻¹ + B⁻¹)⁻¹A⁻¹.

20. Let X be a matrix with full column rank. Then

(I + XX')⁻¹ = I − X(I + X'X)⁻¹X'.
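The inverse identities in results 19 and 20 check out numerically (a sketch using NumPy; the small matrices are made up):

```python
import numpy as np

# (Hypothetical) nonsingular matrices for result 19.
A = np.array([[3.0, 1.0], [0.0, 2.0]])
B = np.array([[1.0, 0.0], [1.0, 2.0]])
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

lhs_19i = np.linalg.inv(A + B)
rhs_19i = Ainv @ np.linalg.inv(Ainv + Binv) @ Binv     # 19(i)
lhs_19ii = Ainv - np.linalg.inv(A + B)
rhs_19ii = Ainv @ np.linalg.inv(Ainv + Binv) @ Ainv    # 19(ii)

# A (hypothetical) 3 x 2 matrix of full column rank for result 20.
X = np.array([[1.0, 0.0], [2.0, 1.0], [0.0, 3.0]])
lhs_20 = np.linalg.inv(np.eye(3) + X @ X.T)
rhs_20 = np.eye(3) - X @ np.linalg.inv(np.eye(2) + X.T @ X) @ X.T
```

Result 20 is useful in practice because the inverse on the right involves only an r × r matrix rather than an n × n one.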

21.

Rules for Matrix Differentiation. s is an element of X; every other symbol is a matrix. ‖ ‖ denotes the absolute value of the determinant.

(vi) ∂/∂A tr A⁻¹B = −(A⁻¹BA⁻¹)'

(vii) ∂/∂A tr AB = B'

(viii) ∂/∂A tr ABA'C = CAB + C'AB'

(ix) ∂/∂A tr A'BAC = B'AC' + BAC

22. Kronecker Products. Let A = {aᵢⱼ} be a K × L matrix and let B be an M × N matrix. Then the Kronecker product A ⊗ B is the KM × LN block matrix whose (i, j)th block is aᵢⱼB.

(i) (A ⊗ B)(C ⊗ D) = AC ⊗ BD if AC and BD can be defined.

(ii) tr (A ⊗ B) = tr A · tr B if A and B are square.

(iii) Let A be a K × K matrix with characteristic roots λ₁, λ₂, . . . , λₖ and let B be an M × M matrix with characteristic roots μ₁, μ₂, . . . , μₘ. Then the KM characteristic roots of A ⊗ B are λᵢμⱼ, i = 1, 2, . . . , K, j = 1, 2, . . . , M.

(iv) Let A and B be as in (iii). Then

|A ⊗ B| = |A|^M · |B|^K.

(v) Let A, B, and C be matrices of appropriate sizes such that ABC can be defined. Suppose A has L columns and write the L columns as A = (A₁, A₂, . . . , Aₗ). Define vec (A) = (A₁', A₂', . . . , Aₗ')'. Then

vec (ABC) = (C' ⊗ A) vec (B).
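The vec rule of 22(v) can be checked directly (a sketch using NumPy; the matrices are made up). Note that vec stacks columns, so NumPy's row-major default requires `order='F'`:

```python
import numpy as np

def vec(M):
    # Stack the columns of M into a single column vector.
    return M.reshape(-1, 1, order="F")

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])                 # 2 x 2
B = np.array([[1.0, 0.0, 2.0],
              [0.0, 1.0, 1.0]])            # 2 x 3
C = np.array([[1.0, 1.0],
              [0.0, 2.0],
              [3.0, 0.0]])                 # 3 x 2
lhs = vec(A @ B @ C)
rhs = np.kron(C.T, A) @ vec(B)             # (C' (x) A) vec(B)
```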

23. Hadamard Products. (Minc and Marcus, 1964, p. 120.) Let A = {aᵢⱼ} and B = {bᵢⱼ} be matrices of the same size. Then the Hadamard product A * B is defined by A * B = {aᵢⱼbᵢⱼ}.

(i) Let A and B be both n × n. Then A * B = S'(A ⊗ B)S, where S is the n² × n matrix the ith column of which has 1 in its [(i − 1)n + i]th position and 0 elsewhere.
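The selection-matrix identity of 23(i) can be verified as follows (a sketch using NumPy; the random 3 × 3 matrices are made up):

```python
import numpy as np

n = 3
rng = np.random.default_rng(3)
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))

# Build the n^2 x n matrix S: column i (0-based) has a 1 in row i*n + i,
# i.e. position (i - 1)n + i in the 1-based indexing of the text.
S = np.zeros((n * n, n))
for i in range(n):
    S[i * n + i, i] = 1.0

hadamard = A * B                     # NumPy's elementwise * is the Hadamard product
via_kron = S.T @ np.kron(A, B) @ S   # S'(A (x) B)S
```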
