# Joint Confidence Intervals and Test of Hypotheses

We have learned how to test any single hypothesis involving a linear combination of the $\beta$'s. But what if we are interested in testing two, three, or more hypotheses involving linear combinations of the $\beta$'s? For example, testing that $\beta_2 = \beta_4 = 0$, i.e., that the variables $X_2$ and $X_4$ are not significant in the model. This can be written as $c_2'\beta = c_4'\beta = 0$, where $c_j$ is a vector of zeros with a one in the $j$-th position. In order to test these two hypotheses simultaneously, we rearrange these restrictions on the $\beta$'s in matrix form $R\beta = 0$, where $R' = [c_2, c_4]$. In a similar fashion, we can rearrange $g$ restrictions on the $\beta$'s into this matrix $R$, which will now be of dimension $(g \times k)$. Also, these restrictions need not be of the form $R\beta = 0$; they can take the more general form $R\beta = r$, where $r$ is a $(g \times 1)$ vector of constants. For example, $\beta_1 + \beta_2 = 1$ and $3\beta_3 + 2\beta_4 = 5$ are two such restrictions. Since $R\beta$ is a collection of linear combinations of the $\beta$'s, the BLUE of $R\beta$ is $R\hat{\beta}_{OLS}$, and the latter is distributed $N(R\beta, \sigma^2 R(X'X)^{-1}R')$. Standardization of the form encountered with the scalar $c'\beta$ gives us the following:

$$(R\hat{\beta}_{OLS} - R\beta)'[R(X'X)^{-1}R']^{-1}(R\hat{\beta}_{OLS} - R\beta)/\sigma^2 \tag{7.29}$$
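The two kinds of restriction sets above can be written down concretely. The following sketch (with a hypothetical $k = 4$ model; the numbers are invented for illustration) builds $R$ and $r$ for the exclusion restrictions $\beta_2 = \beta_4 = 0$ and for the general restrictions $\beta_1 + \beta_2 = 1$, $3\beta_3 + 2\beta_4 = 5$:

```python
import numpy as np

k = 4  # number of coefficients beta_1, ..., beta_4 (hypothetical model size)

# Exclusion restrictions beta_2 = beta_4 = 0: each row of R is a c_j'
# vector, i.e., zeros with a one in the j-th position (1-based j -> index j-1).
R_zero = np.zeros((2, k))
R_zero[0, 1] = 1.0   # picks out beta_2
R_zero[1, 3] = 1.0   # picks out beta_4
r_zero = np.zeros(2)

# General form R beta = r: beta_1 + beta_2 = 1 and 3*beta_3 + 2*beta_4 = 5.
R_gen = np.array([[1.0, 1.0, 0.0, 0.0],
                  [0.0, 0.0, 3.0, 2.0]])
r_gen = np.array([1.0, 5.0])

# A coefficient vector that satisfies both general restrictions:
beta = np.array([0.4, 0.6, 1.0, 1.0])
print(np.allclose(R_gen @ beta, r_gen))
```

Each restriction contributes one row to $R$ and one entry to $r$, so $g$ restrictions give the $(g \times k)$ matrix described in the text.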

Rather than dividing by the variance, we multiply by its inverse; and since we divided by the variance rather than the standard deviation, we square the numerator, which in vector form means premultiplying by its transpose. Problem 9 replaces the matrix $R$ by the vector $c'$ and shows that (7.29) reduces to the square of the z-statistic observed in (7.26). This also proves that the resulting statistic is distributed as $\chi^2_1$. But what is the distribution of (7.29)? The trick is to write it in terms of the original disturbances, i.e.,

$$u'X(X'X)^{-1}R'[R(X'X)^{-1}R']^{-1}R(X'X)^{-1}X'u/\sigma^2 \tag{7.30}$$

where $(R\hat{\beta}_{OLS} - R\beta)$ is replaced by $R(X'X)^{-1}X'u$. Note that (7.30) is quadratic in the disturbances $u$, of the form $u'Au/\sigma^2$. Problem 10 shows that $A$ is symmetric, idempotent, and of rank $g$. Applying the same proof as given below lemma 1, we get the result that (7.30) is distributed as $\chi^2_g$. Again, $\sigma^2$ is unobserved, so we divide by $(n-k)s^2/\sigma^2$, which is $\chi^2_{n-k}$. This becomes a ratio of two $\chi^2$ random variables. If we divide the numerator and denominator $\chi^2$'s by their respective degrees of freedom and prove that they are independent (see problem 11), the resulting statistic

$$(R\hat{\beta}_{OLS} - r)'[R(X'X)^{-1}R']^{-1}(R\hat{\beta}_{OLS} - r)/gs^2 \tag{7.31}$$

is distributed under the null $R\beta = r$ as an $F(g, n-k)$.
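As a numerical illustration of the derivation, the sketch below (with simulated data; the sample size, design matrix, and true coefficients are invented for the example) checks that the matrix $A$ of the quadratic form in (7.30) is symmetric, idempotent, and of rank $g$, and that the F-statistic (7.31) coincides with the familiar restricted-vs-unrestricted residual-sum-of-squares form of the F-test:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, g = 80, 4, 2

# Simulated design with an intercept; true beta_2 = beta_4 = 0 so the null holds.
X = np.column_stack([np.ones(n), rng.normal(size=(n, k - 1))])
beta_true = np.array([1.0, 0.0, 0.5, 0.0])
y = X @ beta_true + rng.normal(size=n)

XtX_inv = np.linalg.inv(X.T @ X)
b_ols = XtX_inv @ X.T @ y
resid = y - X @ b_ols
s2 = resid @ resid / (n - k)          # unbiased estimate of sigma^2

# R beta = r encodes beta_2 = beta_4 = 0 (columns 1 and 3, 0-indexed).
R = np.zeros((g, k))
R[0, 1] = 1.0
R[1, 3] = 1.0
r = np.zeros(g)

# The matrix A of the quadratic form u'Au/sigma^2 in (7.30):
mid = np.linalg.inv(R @ XtX_inv @ R.T)
A = X @ XtX_inv @ R.T @ mid @ R @ XtX_inv @ X.T
print(np.allclose(A, A.T))            # symmetric
print(np.allclose(A @ A, A))          # idempotent
print(round(np.trace(A)) == g)        # rank = trace = g for idempotent A

# The F-statistic (7.31):
diff = R @ b_ols - r
F = diff @ mid @ diff / (g * s2)

# Equivalent residual-sum-of-squares form: refit with the restricted
# columns dropped and compare the two statistics.
X_r = X[:, [0, 2]]
b_r = np.linalg.lstsq(X_r, y, rcond=None)[0]
rss_r = np.sum((y - X_r @ b_r) ** 2)
rss_u = resid @ resid
F_alt = ((rss_r - rss_u) / g) / (rss_u / (n - k))
print(np.isclose(F, F_alt))
```

Since $A$ is idempotent, its rank equals its trace, which is why rounding the trace recovers $g$; the agreement of the two F computations is the standard equivalence between the Wald form (7.31) and the sum-of-squares form of the test.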