A Companion to Theoretical Econometrics

An Artificial Regression for GMM Estimation

Another useful artificial regression, much less well known than the OPG regression, is available for a class of models estimated by the generalized method of moments (GMM). Many such models can be formulated in terms of functions $f_t(\theta)$ of the model parameters and the data, such that, when they are evaluated at the true $\theta$, their expectations conditional on corresponding information sets, $\Omega_t$, vanish. The $\Omega_t$ usually contain all information available prior to the time of observation, and so, as with the GNR and the OPG regression, lags of dependent variables are allowed.

Let the $n \times l$ matrix W denote the instruments used to obtain the GMM estimates. The $t$th row of W, denoted $W_t$, must contain variables in $\Omega_t$ only...
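To make the setup concrete, the sketch below (ours, not the chapter's) estimates a linear model by GMM, using moment functions $f_t(\theta) = y_t - \theta_1 - \theta_2 x_t$ and the criterion $f^\top(\theta) W (W^\top W)^{-1} W^\top f(\theta)$. The data-generating process, variable names, and the particular criterion shown are illustrative assumptions rather than the chapter's own construction.

```python
# Sketch of GMM with conditional moment restrictions E[f_t(theta) | Omega_t] = 0,
# using an n x l instrument matrix W whose t-th row W_t contains variables in Omega_t.
import numpy as np
from scipy.optimize import minimize

# Illustrative data (assumed, not from the text): x_t is endogenous, W_t is predetermined.
rng = np.random.default_rng(0)
n = 200
W = np.column_stack([np.ones(n), rng.normal(size=n)])   # instruments
v = rng.normal(size=n)
x = W @ np.array([1.0, 0.5]) + v                         # regressor correlated with the error
u = 0.8 * v + rng.normal(size=n)
y = 2.0 + 1.5 * x + u

def f(theta):
    """Moment functions f_t(theta) = y_t - theta_1 - theta_2 * x_t."""
    return y - theta[0] - theta[1] * x

def criterion(theta):
    """GMM criterion f(theta)' W (W'W)^{-1} W' f(theta)."""
    Wf = W.T @ f(theta)
    return Wf @ np.linalg.solve(W.T @ W, Wf)

theta_hat = minimize(criterion, x0=np.zeros(2)).x
print(theta_hat)
```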

Artificial Regressions and Heteroskedasticity

Covariance matrices and test statistics calculated via the GNR (1.7), or via artificial regressions such as (1.35) and (1.36), are not asymptotically valid when the assumption that the error terms are iid is violated. Consider a modified version of the nonlinear regression model (1.3), in which $E(uu^\top) = \Omega$, where $\Omega$ is an $n \times n$ diagonal matrix with $t$th diagonal element $\omega_t^2$. Let $V$ denote an $n \times n$ diagonal matrix with the squared residual $\hat u_t^2$ as the $t$th diagonal element. It has been known since the work of White (1980) that the matrix

$(X^\top X)^{-1} X^\top V X (X^\top X)^{-1}$  (1.37)

provides an estimator of $\mathrm{var}(\hat\beta)$, which can be used in place of the usual estimator, $s^2(X^\top X)^{-1}$...
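Expression (1.37) is straightforward to compute. The sketch below, with illustrative variable names and assuming an OLS fit of y on X, forms the sandwich without ever constructing the $n \times n$ matrix $V$ explicitly.

```python
# Sketch of the heteroskedasticity-consistent covariance matrix (1.37):
# (X'X)^{-1} X'VX (X'X)^{-1}, with V = diag(u_hat_t^2) built from the OLS residuals.
import numpy as np

def hccme(X, y):
    """White (1980) covariance estimator for the OLS coefficient vector."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y
    u_hat = y - X @ beta_hat                      # OLS residuals
    meat = X.T @ (u_hat[:, None] ** 2 * X)        # X'VX, computed column by column
    return beta_hat, XtX_inv @ meat @ XtX_inv     # sandwich: (X'X)^{-1} X'VX (X'X)^{-1}
```

This is the simplest form of the estimator; variants that rescale the squared residuals (often labelled HC1, HC2, HC3 in the literature) are not shown here.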

Double-Length Regressions

Up to this point, the number of observations for all the artificial regressions we have studied has been equal to n, the number of observations in the data. In some cases, however, artificial regressions may have 2n or even 3n observations. This can happen whenever each observation makes two or more contributions to the criterion function.

The first double-length artificial regression, or DLR, was proposed by Davidson and MacKinnon (1984a). We will refer to it as the DLR, even though it is no longer the only artificial regression with 2n observations. The class of models to which the DLR applies is a subclass of the one used for GMM estimation. Such models may be written as

$f_t(y_t, \theta) = \varepsilon_t, \qquad t = 1, \ldots, n, \qquad \varepsilon_t \sim \mathrm{NID}(0, 1),$  (1.47)

where, as before, each $f_t(\cdot)$ is a smooth function that depends on...
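To see how a familiar model fits into this class (the example is ours, offered only as an illustration), take the nonlinear regression model (1.3) with normally distributed errors:

$y_t = x_t(\beta) + u_t, \quad u_t \sim \mathrm{NID}(0, \sigma^2), \qquad \text{so that} \qquad f_t(y_t, \theta) = \frac{y_t - x_t(\beta)}{\sigma}, \quad \theta = (\beta, \sigma).$

Each observation then contributes two parameter-dependent terms to the loglikelihood, $-\tfrac{1}{2} f_t^2(y_t, \theta)$ and the Jacobian term $\log\lvert\partial f_t/\partial y_t\rvert = -\log \sigma$, which is what makes an artificial regression with $2n$ observations natural for this class of models.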

An Artificial Regression for Binary Response Models

For binary response models such as the logit and probit models, there exists a very simple artificial regression that can be derived as an extension of the Gauss-Newton regression. It was independently suggested by Engle (1984) and Davidson and MacKinnon (1984b).

The object of a binary response model is to predict the probability that the binary dependent variable, $y_t$, is equal to 1 conditional on some information set $\Omega_t$. A useful class of binary response models can be written as

$E(y_t \mid \Omega_t) = \Pr(y_t = 1) = F(Z_t \beta).$  (1.51)

Here $Z_t$ is a row vector of explanatory variables that belong to $\Omega_t$, $\beta$ is the vector of parameters to be estimated, and $F(x)$ is the differentiable cumulative distribution function (CDF) of some scalar probability distribution...
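As an illustrative sketch (the data, function names, and estimation code are our assumptions, not the chapter's), the logit and probit versions of (1.51) take $F$ to be the logistic and standard normal CDFs respectively, and $\beta$ can be estimated by maximum likelihood.

```python
# Binary response model (1.51): Pr(y_t = 1 | Omega_t) = F(Z_t beta),
# with F the logistic CDF (logit) or the standard normal CDF (probit).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def logit_cdf(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_loglik(beta, y, Z, F):
    """Negative binary-response loglikelihood for a given CDF F."""
    p = F(Z @ beta)
    eps = 1e-12                              # guard against log(0)
    return -np.sum(y * np.log(p + eps) + (1 - y) * np.log(1 - p + eps))

# Illustrative data (assumed, not from the text).
rng = np.random.default_rng(1)
n = 500
Z = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([-0.5, 1.0])
y = (rng.uniform(size=n) < logit_cdf(Z @ beta_true)).astype(float)

beta_logit = minimize(neg_loglik, np.zeros(2), args=(y, Z, logit_cdf)).x
beta_probit = minimize(neg_loglik, np.zeros(2), args=(y, Z, norm.cdf)).x
print(beta_logit, beta_probit)
```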

General Hypothesis Testing

Anil K. Bera and Gamini Premaratne*

1 Introduction

The history of statistical hypothesis testing is, indeed, very long. Neyman and Pearson (1933) traced its origin to Bayes (1763). However, systematic applications of hypothesis testing began only after the publication of Karl Pearson’s (1900) goodness-of-fit test, which is regarded as one of the 20 most important scientific breakthroughs in this century. In terms of the development of statistical methods, Ronald Fisher took up where Pearson left off. Fisher (1922) can be regarded as the analytical beginning of statistical methods. In his paper Fisher advocated the use of maximum likelihood estimation and provided the general theory of parametric statistical inference...
