Asymptotic Properties of Extremum Estimators

By extremum estimators we mean estimators obtained by either maximizing or minimizing a certain function defined over the parameter space. First, we shall establish conditions for the consistency and the asymptotic normality of extremum estimators (Section 4.1), and second, we shall apply the results to important special cases, namely, the maximum likelihood estimator (Section 4.2) and the nonlinear least squares estimator (Section 4.3).
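To make the definition concrete, the following minimal sketch (in Python, using numpy and scipy; the exponential regression model, the true parameter value 0.5, and the criterion name Q are hypothetical illustrations, not taken from the text) treats the nonlinear least squares estimator of Section 4.3 as an extremum estimator, that is, as the minimizer of a sum-of-squares criterion over the parameter space.

```python
# Illustrative sketch only: an extremum estimator as the minimizer of a
# criterion function Q(theta) over the parameter space.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Hypothetical model y_t = exp(theta * x_t) + u_t with true theta = 0.5.
x = rng.uniform(0.0, 2.0, size=200)
y = np.exp(0.5 * x) + rng.normal(scale=0.1, size=200)

def Q(theta):
    """Criterion function: sum of squared residuals."""
    return np.sum((y - np.exp(theta[0] * x)) ** 2)

# The nonlinear least squares estimate is the minimizer of Q.
theta_hat = minimize(Q, x0=[0.0]).x
print(theta_hat)  # should lie near 0.5 for a sample of this size
```

The maximum likelihood estimator of Section 4.2 fits the same template, with the criterion taken to be the log likelihood (maximized rather than minimized).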

What we call extremum estimators Huber called M estimators, meaning maximum-likelihood-like estimators. He developed their asymptotic properties in a series of articles (summarized in Huber, 1981). The emphasis here, however, will be different from his. The treatment in this chapter is more general in the sense that we require neither independence nor identical distribution of the random variables. Also, the intention here is not to strive for the least stringent set of assumptions but to help the reader understand the fundamental facts by providing an easily comprehensible set of sufficient conditions.

In Sections 4.4 and 4.5 we shall discuss iterative methods for maximization or minimization, the asymptotic properties of the likelihood ratio and asymptotically equivalent tests, and related topics. In Section 4.6 we shall discuss the least absolute deviations estimator, for which the general results of Section 4.1 are only partially applicable.
