A Companion to Theoretical Econometrics

General Hypothesis Testing

Anil K. Bera and Gamini Premaratne*

1 Introduction

The history of statistical hypothesis testing is, indeed, very long. Neyman and Pearson (1933) traced its origin to Bayes (1763). However, systematic applications of hypothesis testing began only after the publication of Karl Pearson’s (1900) goodness-of-fit test, which is regarded as one of the 20 most important scientific breakthroughs in this century. In terms of the development of statistical methods, Ronald Fisher took up where Pearson left off. Fisher (1922) can be regarded as the analytical beginning of statistical methods. In his paper Fisher advocated the use of maximum likelihood estimation and provided the general theory of parametric statistical inference...

A COMPANION TO THEORETICAL ECONOMETRICS

This is the first companion in econometrics. It comprises 32 chapters written by international experts in the field. The emphasis of this companion is on "keeping things simple," so as to give students of econometrics a guide through the maze of important topics in the subject. The chapters are intended for readers and users of econometrics who are not looking for exhaustive surveys. Instead, each chapter gives the reader the basics and points to further readings on the subject...

Some Test Principles Suggested in the Statistics Literature

We start by introducing some notation and concepts. Suppose we have n independent observations y1, y2, …, yn on a random variable Y with density function f(y; θ), where θ is a p × 1 parameter vector with θ ∈ Θ ⊂ ℝ^p. It is assumed that f(y; θ) satisfies the regularity conditions stated in Rao (1973, p. 364) and Serfling (1980, p. 144). The likelihood function is given by

L(θ; y) = Π_{i=1}^{n} f(y_i; θ),

where y = (y1, y2, …, yn)′ denotes the sample.

Suppose we are interested in testing a simple hypothesis H0 : θ = θ0 against another simple hypothesis H1 : θ = θ1. Let S denote the sample space. In standard test procedures, S is partitioned into two regions, ω and its complement ω^c. We reject the null hypothesis if the sample y ∈ ω; otherwise, we do not reject H0...
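To make these notions concrete, here is a minimal numerical sketch (our own illustration, not code from the chapter): the log-likelihood for n iid N(θ, 1) observations, and the likelihood ratio used to compare the two simple hypotheses H0 : θ = θ0 and H1 : θ = θ1. The normal density, the sample size, and the parameter values are all assumptions made for the example.

```python
# Illustration (assumed setup): likelihood L(theta; y) = prod_i f(y_i; theta)
# for n iid N(theta, 1) observations, and the likelihood ratio comparing
# two simple hypotheses H0: theta = theta0 vs H1: theta = theta1.
import numpy as np

def log_likelihood(theta, y):
    # log L(theta; y) = sum_i log f(y_i; theta) for the N(theta, 1) density
    return np.sum(-0.5 * np.log(2 * np.pi) - 0.5 * (y - theta) ** 2)

rng = np.random.default_rng(0)
y = rng.normal(loc=1.0, scale=1.0, size=50)   # data generated under theta = 1

theta0, theta1 = 0.0, 1.0
lr = np.exp(log_likelihood(theta1, y) - log_likelihood(theta0, y))

# With data drawn under H1, the ratio should typically exceed 1, so the
# sample y falls in the rejection region omega for a moderate cutoff k.
print(lr > 1.0)
```

The partition of the sample space then corresponds to thresholding this ratio: ω collects the samples where lr exceeds some k.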

Artificial Regressions

Russell Davidson and James G. MacKinnon

1 Introduction

All popular nonlinear estimation methods, including nonlinear least squares (NLS), maximum likelihood (ML), and the generalized method of moments (GMM), yield estimators which are asymptotically linear. Provided the sample size is large enough, the behavior of these nonlinear estimators in the neighborhood of the true parameter values closely resembles the behavior of the ordinary least squares (OLS) estimator. A particularly illuminating way to see the relationship between any nonlinear estimation method and OLS is to formulate the artificial regression that corresponds to the nonlinear estimator.

An artificial regression is a linear regression in which the regressand and regressors are constructed as functions of the data and parameters...
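The best-known example is the Gauss-Newton regression (GNR) associated with NLS: for a model y_t = x_t(β) + u_t, one regresses the residual y − x(β) on the Jacobian X(β) = ∂x(β)/∂β. Below is a minimal sketch; the particular nonlinear model, starting values, and simulated data are our own assumptions for illustration, not taken from the chapter.

```python
# Sketch (assumed example) of the Gauss-Newton artificial regression for the
# nonlinear model y_t = beta1 * exp(beta2 * x_t) + u_t. The regressand is the
# residual y - x(beta); the regressors are the columns of the Jacobian X(beta).
import numpy as np

def model(beta, x):
    return beta[0] * np.exp(beta[1] * x)

def jacobian(beta, x):
    # Columns: derivatives of the regression function w.r.t. beta1 and beta2
    return np.column_stack([np.exp(beta[1] * x),
                            beta[0] * x * np.exp(beta[1] * x)])

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, size=200)
y = model([2.0, 0.5], x) + 0.1 * rng.normal(size=200)   # true beta = (2.0, 0.5)

# Iterated GNR: regress the current residual on the Jacobian by OLS and
# update beta by the fitted coefficients until convergence.
beta = np.array([1.5, 0.3])
for _ in range(20):
    u = y - model(beta, x)
    X = jacobian(beta, x)
    b, *_ = np.linalg.lstsq(X, u, rcond=None)
    beta = beta + b

# At the NLS solution the GNR coefficients are numerically zero, which is
# one way the artificial regression characterizes the estimator.
print(np.allclose(b, 0.0, atol=1e-6))
```

Run at the NLS estimates, the same regression also delivers a covariance-matrix estimate and OLS-based test statistics, which is what makes artificial regressions useful beyond computation.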

Neyman-Pearson generalized lemma and its applications

The lemma can be stated as follows:

Let g1, g2, …, gm, gm+1 be integrable functions and φ be a test function over S such that 0 ≤ φ ≤ 1, and

∫ φ g_i dy = c_i,  i = 1, 2, …, m,    (2.6)

where c1, c2, …, cm are given constants. Further, let there exist a φ* and constants k1, k2, …, km such that φ* satisfies (2.6), and

φ* = 1 if gm+1 > k1g1 + k2g2 + … + kmgm

φ* = 0 if gm+1 < k1g1 + k2g2 + … + kmgm

Then

∫ φ* gm+1 dy ≥ ∫ φ gm+1 dy

for any φ satisfying (2.6). To obtain the most powerful (MP) test of H0 : θ = θ0 against H1 : θ = θ1, set m = 1, g1 = f(y; θ0), g2 = f(y; θ1), and c1 = α, the size of the test. Then the function φ*(y) defined as

φ*(y) = 1 when f(y; θ1) > k f(y; θ0)
      = 0 when f(y; θ1) < k f(y; θ0)

satisfies

∫ φ*(y) f(y; θ1) dy ≥ ∫ φ(y) f(y; θ1) dy;

that is, φ*(y) will provide the MP test. Therefore, in terms of critical region,

ω = {y : f(y; θ1) > k f(y; θ0)},

where k is such that Pr{ω | H0} = α, is the MP critical region.
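A small numerical check (our own illustration, with an assumed normal model) shows how the MP critical region works in practice: for n iid N(θ, 1) observations with θ0 = 0 and θ1 = 1, the region {y : f(y; θ1) > k f(y; θ0)} reduces to {ȳ > c}, and choosing c so that Pr{ω | H0} = α fixes the size of the test.

```python
# Illustration (assumed setup): for N(theta, 1) data, the MP region for
# H0: theta = 0 vs H1: theta = 1 is equivalent to {ybar > c}, where c is
# chosen so that Pr{omega | H0} = alpha.
import math
from statistics import NormalDist
import numpy as np

n, alpha = 25, 0.05
# Under H0, ybar ~ N(0, 1/n), so c = z_{1-alpha} / sqrt(n) gives size alpha.
c = NormalDist().inv_cdf(1 - alpha) / math.sqrt(n)

# Simulate the size of the test: draw ybar under H0 and count rejections.
rng = np.random.default_rng(2)
reps = 100_000
ybar_h0 = rng.normal(0.0, 1.0 / math.sqrt(n), size=reps)
size = np.mean(ybar_h0 > c)

print(size)   # should be close to alpha = 0.05
```

The simulated rejection rate under H0 matches α up to Monte Carlo error, confirming that the cutoff k on the likelihood ratio translates into the cutoff c on ȳ.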

The N-P lemma also provides the logical basis for the LR test...