
## A COMPANION TO THEORETICAL ECONOMETRICS

This is the first companion in econometrics. It contains 32 chapters written by international experts in the field. The emphasis of this companion is on "keeping things simple" so as to give students of econometrics a guide through the maze of important topics in the subject. These chapters are helpful for readers and users of econometrics who are not looking for exhaustive surveys. Instead, these chapters give the reader some of the basics and point to further reading on the subject...


## Some Test Principles Suggested in the Statistics Literature

We start by introducing some notation and concepts. Suppose we have $n$ independent observations $y_1, y_2, \ldots, y_n$ on a random variable $Y$ with density function $f(y; \theta)$, where $\theta$ is a $p \times 1$ parameter vector with $\theta \in \Theta \subset \mathbb{R}^p$. It is assumed that $f(y; \theta)$ satisfies the regularity conditions stated in Rao (1973, p. 364) and Serfling (1980, p. 144). The likelihood function is given by

$$L(\theta; y) = \prod_{i=1}^{n} f(y_i; \theta),$$

where $y = (y_1, y_2, \ldots, y_n)'$ denotes the sample.
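As a minimal numerical sketch of this setup (my own illustration, not from the text), the code below evaluates the log of the likelihood function for an assumed $N(\mu, 1)$ model with scalar parameter $\theta = \mu$, and locates its maximizer on a grid:

```python
import numpy as np

# Hypothetical example: i.i.d. N(mu, 1) sample, so theta = mu is scalar.
def log_likelihood(mu, y):
    # log L(mu; y) = sum_i log f(y_i; mu) for the N(mu, 1) density
    return np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (y - mu) ** 2)

y = np.array([0.5, -0.2, 1.1, 0.3])

# For this model the sample mean maximizes the likelihood; a grid
# search over candidate values of mu recovers it approximately.
mus = np.linspace(-2, 2, 401)
mu_hat = mus[np.argmax([log_likelihood(m, y) for m in mus])]
```

The grid maximizer lands (up to grid spacing) on the sample mean, the analytical ML estimator for this model.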

Suppose we are interested in testing a simple hypothesis $H_0 : \theta = \theta_0$ against another simple hypothesis $H_1 : \theta = \theta_1$. Let $S$ denote the sample space. In standard test procedures, $S$ is partitioned into two regions, $\omega$ and its complement $\omega^c$. We reject the null hypothesis if the sample $y \in \omega$; otherwise, we do not reject $H_0$...


## Artificial Regressions

Russell Davidson and James G. MacKinnon

### 1 Introduction

All popular nonlinear estimation methods, including nonlinear least squares (NLS), maximum likelihood (ML), and the generalized method of moments (GMM), yield estimators which are asymptotically linear. Provided the sample size is large enough, the behavior of these nonlinear estimators in the neighborhood of the true parameter values closely resembles the behavior of the ordinary least squares (OLS) estimator. A particularly illuminating way to see the relationship between any nonlinear estimation method and OLS is to formulate the artificial regression that corresponds to the nonlinear estimator.

An artificial regression is a linear regression in which the regressand and regressors are constructed as functions of the data and parame...


## The Neyman-Pearson Generalized Lemma and its Applications

The lemma can be stated as follows:

Let $g_1, g_2, \ldots, g_m, g_{m+1}$ be integrable functions and $\phi$ be a test function over $S$ such that $0 \le \phi \le 1$, and

$$\int \phi \, g_i \, dy = c_i, \qquad i = 1, 2, \ldots, m, \tag{2.6}$$

where $c_1, c_2, \ldots, c_m$ are given constants. Further, let there exist a $\phi^*$ and constants $k_1, k_2, \ldots, k_m$ such that $\phi^*$ satisfies (2.6), and

$$\phi^* = 1 \quad \text{if } g_{m+1} > \sum_{i=1}^{m} k_i g_i,$$
$$\phi^* = 0 \quad \text{if } g_{m+1} < \sum_{i=1}^{m} k_i g_i.$$

Then

$$\int \phi^* g_{m+1} \, dy \ge \int \phi \, g_{m+1} \, dy.$$

To apply this lemma to testing $H_0 : \theta = \theta_0$ against $H_1 : \theta = \theta_1$, take $m = 1$, $g_1 = f(y; \theta_0)$, $g_2 = f(y; \theta_1)$, and $c_1 = \alpha$. Then the function $\phi^*(y)$ defined as

$$\phi^*(y) = 1 \quad \text{when } f(y; \theta_1) > k_1 f(y; \theta_0),$$
$$\phi^*(y) = 0 \quad \text{when } f(y; \theta_1) < k_1 f(y; \theta_0),$$

satisfies

$$\int \phi^*(y) f(y; \theta_1) \, dy \ge \int \phi(y) f(y; \theta_1) \, dy;$$

that is, $\phi^*(y)$ will provide the MP test. Therefore, in terms of a critical region,

$$\omega = \left\{ y : \frac{f(y; \theta_1)}{f(y; \theta_0)} > k \right\},$$

where $k$ is such that $\Pr\{y \in \omega \mid H_0\} = \alpha$, is the MP critical region.

The N-P lemma also provides the logical basis for the LR test...


## The Concept of an Artificial Regression

Consider a fully parametric, nonlinear model that is characterized by a parameter vector $\theta$ which belongs to a parameter space $\Theta \subset \mathbb{R}^k$ and which can be estimated by minimizing a criterion function $Q(\theta)$ using $n$ observations. In the case of a nonlinear regression model estimated by nonlinear least squares, $Q(\theta)$ would be one half the sum of squared residuals, and in the case of a model estimated by maximum likelihood, $Q(\theta)$ would be minus the loglikelihood function.

If an artificial regression exists for such a model, it always involves two things: a regressand, $r(\theta)$, and a matrix of regressors, $R(\theta)$. The number of regressors for the artificial regression is equal to $k$, the number of parameters...
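A concrete instance is the Gauss-Newton regression for NLS. The sketch below is my own illustration (the model $y = e^{x\theta} + u$ and all variable names are assumptions, not from the text): the regressand is the residual vector $r(\theta) = y - e^{x\theta}$, the single regressor ($k = 1$) is the derivative of the regression function, and each OLS step on the artificial regression yields the Gauss-Newton parameter update.

```python
import numpy as np

# Simulate data from the assumed nonlinear model y = exp(x * theta) + u
rng = np.random.default_rng(1)
x = rng.uniform(0, 1, 200)
theta_true = 0.7
y = np.exp(x * theta_true) + 0.05 * rng.normal(size=200)

theta = 0.0  # starting value
for _ in range(50):
    r = y - np.exp(x * theta)             # regressand r(theta): residuals
    R = (x * np.exp(x * theta))[:, None]  # regressor matrix R(theta), k = 1
    # OLS on the artificial regression of r on R gives the GN update
    step = np.linalg.lstsq(R, r, rcond=None)[0][0]
    theta += step
    if abs(step) < 1e-10:
        break
```

At convergence the update is zero, which is exactly the NLS first-order condition $R(\theta)' r(\theta) = 0$.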
