Monte Carlo tests in the presence of nuisance parameters: examples from the multivariate regression model

In this section, we provide examples from Dufour and Khalaf (1998a, 1998b) pertaining to LR test criteria in the MLR (reduced form) model. The model was introduced in Section 2.3. Consider the three-equation system

Y1 = P10 + P11X1 + U1,

Y2 = P20 + P22X2 + U2,

Y3 = P30 + P33X3 + U3, (23.38)

imposing normality, and the hypothesis H0 : P11 = P22 = P33. First restate H0 in terms of the MLR model, which includes the SURE system as a special case, so that it incorporates the SURE exclusion restrictions. Formally, in the framework of the MLR model

Y1 = P10 + P11X1 + P12X2 + P13X3 + U1,

Y2 = P20 + P21X1 + P22X2 + P23X3 + U2,

Y3 = P30 + P31X1 + P32X2 + P33X3 + U3, (23.39)

H0 is equivalent to the joint hypothesis

H* : P11 = P22 = P33 and P12 = P13 = P21 = P23 = P31 = P32 = 0. (23.40)

The associated LR statistic is

LR = n ln(Λ), Λ = |Σ̂0| / |Σ̂|, (23.41)

where Σ̂0 and Σ̂ are the restricted and unrestricted SURE MLEs of the error covariance matrix. We also consider

LR* = n ln(Λ*), Λ* = |Σ̂0| / |Σ̂u|, (23.42)

where Σ̂u is the unconstrained estimate of Σ in the "nesting" MLR model. Since the restricted model is the same in both LR and LR*, while the unrestricted model underlying LR* includes the one underlying LR as a special case, it is straightforward to see that LR ≤ LR*, so that the distribution of LR* provides a valid bound on the distribution of LR (in the sense that P[LR ≥ x] ≤ P[LR* ≥ x]).
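As a concrete check on the inequality LR ≤ LR*, the following sketch simulates data from a system like (23.38) and computes both statistics from residual covariance determinants. It is illustrative only: all numeric values are made up, and for simplicity the restricted and SURE fits use ordinary least squares rather than iterated SURE MLEs. The inequality still holds, because the unrestricted MLR (OLS) fit minimizes the generalized residual variance over all coefficient matrices, including the one implied by the SURE fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100

# Hypothetical data generated from a system like (23.38); the coefficient
# and covariance values below are illustrative, not taken from the source.
X = rng.normal(size=(n, 3))
Sigma = np.array([[1.0, 0.5, 0.3],
                  [0.5, 1.0, 0.2],
                  [0.3, 0.2, 1.0]])
U = rng.multivariate_normal(np.zeros(3), Sigma, size=n)
Y = 0.5 + 1.0 * X + U        # common slope: H0 holds in the simulated data

def resid_cov(Y, Yhat):
    E = Y - Yhat
    return (E.T @ E) / len(Y)           # MLE-style covariance (divisor n)

# Restricted fit: SURE exclusions plus a common slope (stacked OLS).
Zr = np.zeros((3 * n, 4))
for i in range(3):
    Zr[i * n:(i + 1) * n, i] = 1.0      # equation-specific intercepts
    Zr[i * n:(i + 1) * n, 3] = X[:, i]  # common-slope column
cr, *_ = np.linalg.lstsq(Zr, Y.T.reshape(-1), rcond=None)
S0 = resid_cov(Y, (Zr @ cr).reshape(3, n).T)

# Unrestricted SURE fit: each equation on its own regressor (per-equation OLS).
Yhat = np.empty_like(Y)
for i in range(3):
    Zi = np.column_stack([np.ones(n), X[:, i]])
    Yhat[:, i] = Zi @ np.linalg.lstsq(Zi, Y[:, i], rcond=None)[0]
S = resid_cov(Y, Yhat)

# Unconstrained "nesting" MLR fit: every equation on all three regressors.
Za = np.column_stack([np.ones(n), X])
Su = resid_cov(Y, Za @ np.linalg.lstsq(Za, Y, rcond=None)[0])

LR = n * np.log(np.linalg.det(S0) / np.linalg.det(S))        # as in (23.41)
LR_star = n * np.log(np.linalg.det(S0) / np.linalg.det(Su))  # as in (23.42)
print(LR <= LR_star)        # the bounding inequality LR <= LR*
```

The key point is that the restricted covariance Σ̂0 is common to both statistics, so the comparison reduces to the determinants of the two unrestricted residual covariances.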

In order to apply a BMC test, we need to construct a set of UL restrictions that satisfy (23.40), so that the corresponding LRc criterion conforming with these UL restrictions yields a valid bound on the distribution of LR. Indeed, as emphasized above, the LR test statistic for UL restrictions is pivotal. Furthermore, by considering UL restrictions obtained as a special case of H*, we can be sure that the associated statistic is always ≥ LR. Here, it is easy to see that the constraints setting the coefficients Pij, i, j = 1, …, 3, to specific values meet this criterion. Note that the statistic just derived serves to bound both LR and LR*.

[Display: the MLR coefficient matrix, with rows (P10, P11, P12, P13), (P20, P21, P22, P23), and (P30, P31, P32, P33).]

Define θ = C(Σ) as the vector of the parameters on or below the diagonal of the Cholesky factor T(Σ) of the covariance matrix Σ (i.e. T(Σ) is the lower triangular matrix such that T(Σ)T(Σ)′ = Σ). The algorithm for performing MC tests based on LR*, at the 5 percent level with 99 replications, can be described as follows.

• In a first step, compute the observed statistics LR and LR* from the data; here S11 = S22 = S33 denote the constrained SURE estimates calculated in this first step.

• Call the bound MC procedure BMC(θ), described below, for θ = C(Σ̂0). The Cholesky decomposition is used to impose positive definiteness and avoid redundant parameters. The output is the BMC p-value. Reject the null if the latter is < .05 and STOP.

• Otherwise, call the procedure MC(θ), also described below, for θ = C(Σ̂0). It is important to note here that Σ is the only relevant nuisance parameter, for the example considered involves linear constraints (see Breusch, 1980). The output is the LMC p-value. Declare the test not significant if the latter exceeds .05 and STOP.

• Otherwise, call the maximization algorithm (for example, SA) for the function MC(θ), using θ = C(Σ̂0) as a starting value. Obtain the MMC p-value and reject the null if the latter is < .05. Note: if only a decision is required, the maximization algorithm may be instructed to exit as soon as a value larger than .05 is attained. This may save considerable computation time.
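The Cholesky parameterization θ = C(Σ) and the three-stage decision rule above can be sketched as follows. This is a minimal illustration, not the authors' implementation: `bmc_pvalue`, `mc_pvalue`, and `maximize_pvalue` are hypothetical stand-ins for the procedures BMC(θ), MC(θ), and a global optimizer such as SA.

```python
import numpy as np

def C(Sigma):
    """theta = C(Sigma): the entries on or below the diagonal of the
    lower-triangular Cholesky factor T(Sigma), where T @ T.T == Sigma."""
    T = np.linalg.cholesky(Sigma)
    return T[np.tril_indices_from(T)]

def C_inv(theta, p):
    """Rebuild Sigma from theta.  Any theta vector maps back to a positive
    semidefinite matrix T @ T.T, which is how the parameterization imposes
    positive definiteness and avoids redundant parameters."""
    T = np.zeros((p, p))
    T[np.tril_indices(p)] = theta
    return T @ T.T

def three_stage_test(theta0, bmc_pvalue, mc_pvalue, maximize_pvalue, alpha=0.05):
    """theta0 = C(Sigma_hat_0), built from the restricted estimates."""
    # Stage 1 (BMC): the bound p-value is valid whatever the nuisance parameters,
    # so a rejection here is an exact rejection; STOP.
    if bmc_pvalue(theta0) < alpha:
        return "reject"
    # Stage 2 (LMC): local MC test at the consistent restricted estimate; STOP
    # if the p-value exceeds alpha.
    if mc_pvalue(theta0) > alpha:
        return "not significant"
    # Stage 3 (MMC): maximize the MC p-value over theta, starting at theta0;
    # the optimizer may exit early once a value above alpha is found.
    return "reject" if maximize_pvalue(mc_pvalue, theta0) < alpha else "not significant"
```

Only when the bound test and the local test disagree does the (expensive) maximization stage run, which is the point of ordering the stages this way.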

Description of the procedure MC(θ):

• Construct a lower triangular Q from θ (this gives the Cholesky factor of the covariance matrix that will be used to generate the simulated model).

• Do for j = 1,…, N (independently)

(a) Generate the random vectors Y1(j), Y2(j), Y3(j) conformably with the nesting MLR model, using the restricted SURE coefficient estimates, the observed regressors, and simulated disturbances U(j) constructed with Q.

(b) Estimate the MLR model with Y1(j), Y2(j), Y3(j) as dependent variables and the observed regressors as independent variables: obtain the unrestricted estimates and the estimates imposing H0.

(c) From these estimates, form the statistic LR*(j) and store it.

• Obtain the rank of the observed statistic LR* in the combined series LR*, LR*(1), …, LR*(99).

• This yields an MC p-value, as described above, which is the output of the procedure.

• The BMC(θ) procedure may be obtained as just described, replacing LR*(j) by LRc(j). Alternatively, the BMC procedure may be rewritten following the methodology relating to MC tests of UL hypotheses, so that no (unknown) parameters intervene in the generation of the simulated (bounding) statistics. Indeed, the bounding statistic satisfies (23.27) under (23.19). Thus LRc(j) may be obtained using draws from, e.g., the multivariate independent normal distribution.
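The rank-based p-value computation in the procedure above can be sketched as follows. This is a sketch under stated assumptions: `simulate_statistic` is a hypothetical stand-in for steps (a)–(c), and the illustration replaces it with chi-squared draws purely to exercise the function.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_pvalue(observed_stat, simulate_statistic, N=99):
    """MC p-value from the rank of the observed statistic in the combined
    series of the observed value plus N simulated values:
    p = (number of simulated statistics >= observed, plus 1) / (N + 1)."""
    sims = np.array([simulate_statistic() for _ in range(N)])
    return (np.sum(sims >= observed_stat) + 1.0) / (N + 1.0)

# Illustration only: draws from a known chi-squared null distribution,
# standing in for the simulated LR*(j) statistics.
p = mc_pvalue(rng.chisquare(df=3), lambda: rng.chisquare(df=3))
print(0.01 <= p <= 1.0)    # with N = 99, p is a multiple of 1/100
```

With N = 99 and a 5 percent level, the rejection rule p ≤ .05 corresponds to the observed statistic ranking among the five largest of the 100 values.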

In Dufour and Khalaf (1998c), we report the results of a simulation experiment designed according to this example. In particular, we examine the performance of LMC and BMC tests. We show that the MC test procedure achieves perfect size control and has good power. The same methodology may also be applied in simultaneous equations models such as (23.3). In Dufour and Khalaf (1998b), we present simulations which illustrate the performance of limited-information LR-based MC tests in this context. We have attempted to apply the MC test procedure to the IV-based Wald-type test for linear restrictions on structural parameters. In this case, the performance of the standard bootstrap was disappointing. The LMC Wald tests failed completely in near-unidentified conditions. Furthermore, in all cases examined, the maximal randomized p-values of the Wald tests were always one. This is a case (refer to Section 2.3) where the MC procedure does not (and cannot) correct the performance of the test.

In other words, Wald statistics do not constitute valid pivotal functions in such models, and it is even impossible to bound their distributions over the parameter space (except by the trivial bounds 0 and 1). (Dufour, 1997)

These results are also related to the non-invariance problems associated with Wald tests in nonlinear contexts (see, e.g., Dufour, 1997; Dagenais and Dufour, 1991). Indeed, it is evident from (23.3)-(23.5) that seemingly linear constraints on structural coefficients in instrumental regressions often involve nonlinear hypotheses implied by the structure. Of course, not all Wald tests will suffer from such problems. For instance, Wald tests for linear restrictions in linear regression models yield exact F-tests.
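A minimal numeric illustration of this non-invariance (with made-up numbers, not from the source): the same null hypothesis, written in two algebraically equivalent forms, yields different Wald statistics.

```python
# Non-invariance of the Wald test: the same null written two ways gives two
# different statistics.  The estimate b and standard error se are made up.
b, se = 1.5, 0.4

# Form 1: H0 stated as g(b) = b - 1 = 0.
W1 = ((b - 1.0) / se) ** 2

# Form 2: the same H0 stated as g(b) = 1 - 1/b = 0; by the delta method,
# g'(b) = 1/b**2, so the standard error of g(b) is se / b**2.
W2 = ((1.0 - 1.0 / b) / (se / b ** 2)) ** 2

print(W1 != W2)    # different Wald statistics for logically identical nulls
```

Since the two forms impose the same restriction yet produce different test statistics, any fixed critical value treats them differently, which is the invariance failure the references above analyze.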

We conclude this section with a specific problem where the MC test strategy conveniently solves a difficult and non-standard distributional problem: the problem of unidentified nuisance parameters.
