Constrained Least Squares Estimator (CLS)

The constrained least squares estimator (CLS) of $\beta$, denoted by $\bar\beta$, is defined to be the value of $\beta$ that minimizes the sum of squared residuals

$$S(\beta) = (y - X\beta)'(y - X\beta) \tag{1.4.2}$$

under the constraints (1.4.1). In Section 1.2.1 we showed that (1.4.2) is minimized without constraint at the least squares estimator $\hat\beta$. Writing $S(\hat\beta)$ for the sum of squares of the least squares residuals, we can rewrite (1.4.2) as

$$S(\beta) = S(\hat\beta) + (\beta - \hat\beta)'X'X(\beta - \hat\beta). \tag{1.4.3}$$
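To see why (1.4.3) holds, write $y - X\beta = (y - X\hat\beta) - X(\beta - \hat\beta)$ and expand:

$$(y - X\beta)'(y - X\beta) = S(\hat\beta) - 2(\beta - \hat\beta)'X'(y - X\hat\beta) + (\beta - \hat\beta)'X'X(\beta - \hat\beta).$$

The middle term vanishes because the normal equations give $X'(y - X\hat\beta) = 0$, which leaves exactly (1.4.3).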

Instead of directly minimizing (1.4.2) under (1.4.1), we minimize (1.4.3) under (1.4.1), which is mathematically simpler.

Put $\hat\beta - \beta = \delta$ and $Q'\hat\beta - c = \gamma$. Then, because $S(\hat\beta)$ does not depend on $\beta$, the problem is equivalent to the minimization of $\delta'X'X\delta$ under $Q'\delta = \gamma$. Equating the derivatives of $\delta'X'X\delta - 2\lambda'(Q'\delta - \gamma)$ with respect to $\delta$ and the $q$-vector of Lagrange multipliers $\lambda$ to zero, we obtain the solution

$$\delta = (X'X)^{-1}Q[Q'(X'X)^{-1}Q]^{-1}\gamma. \tag{1.4.4}$$
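Spelling out that step: setting the derivative with respect to $\delta$ to zero gives $2X'X\delta - 2Q\lambda = 0$, so $\delta = (X'X)^{-1}Q\lambda$; setting the derivative with respect to $\lambda$ to zero recovers the constraint $Q'\delta = \gamma$, so that $Q'(X'X)^{-1}Q\lambda = \gamma$ and hence $\lambda = [Q'(X'X)^{-1}Q]^{-1}\gamma$. Substituting $\lambda$ back into $\delta = (X'X)^{-1}Q\lambda$ yields (1.4.4).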

Transforming from $\delta$ and $\gamma$ back to the original variables, we can write the minimizing value $\bar\beta$ of $S(\beta)$ as

$$\bar\beta = \hat\beta - (X'X)^{-1}Q[Q'(X'X)^{-1}Q]^{-1}(Q'\hat\beta - c). \tag{1.4.5}$$

The corresponding estimator of $\sigma^2$ can be defined as

$$\bar\sigma^2 = T^{-1}(y - X\bar\beta)'(y - X\bar\beta). \tag{1.4.6}$$
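As a numerical illustration, here is a minimal sketch of (1.4.5) and (1.4.6) in Python with NumPy, assuming the constraints (1.4.1) take the linear form $Q'\beta = c$ with $Q$ of full column rank and $X'X$ nonsingular; the function and variable names are illustrative, not from the text.

```python
import numpy as np

def constrained_least_squares(X, y, Q, c):
    """Sketch of the CLS estimator (1.4.5) and the variance estimator (1.4.6),
    assuming linear constraints Q'beta = c."""
    T = X.shape[0]
    XtX_inv = np.linalg.inv(X.T @ X)
    beta_hat = XtX_inv @ X.T @ y                          # unconstrained LS estimator
    A = XtX_inv @ Q                                       # (X'X)^{-1} Q
    M = np.linalg.inv(Q.T @ A)                            # [Q'(X'X)^{-1} Q]^{-1}
    beta_bar = beta_hat - A @ M @ (Q.T @ beta_hat - c)    # (1.4.5)
    resid = y - X @ beta_bar
    sigma2_bar = (resid @ resid) / T                      # (1.4.6)
    return beta_bar, sigma2_bar

# Example: impose beta_1 + beta_2 = 1 on simulated data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([0.4, 0.6, 1.0]) + rng.normal(size=100)
Q = np.array([[1.0], [1.0], [0.0]])    # Q'beta picks out beta_1 + beta_2
c = np.array([1.0])
beta_bar, sigma2_bar = constrained_least_squares(X, y, Q, c)
assert np.allclose(Q.T @ beta_bar, c)  # the constraint holds exactly
```

In practice one would solve the relevant linear systems rather than form explicit inverses; the inverses are kept here only to mirror the algebra of (1.4.5).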

It is easy to show that $\bar\beta$ and $\bar\sigma^2$ are the constrained maximum likelihood estimators if we assume the normality of $u$ in Model 1.
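A sketch of that claim, under the normality assumption: the log likelihood of Model 1 is

$$\log L = -\frac{T}{2}\log(2\pi\sigma^2) - \frac{1}{2\sigma^2}S(\beta),$$

so for any fixed $\sigma^2$, maximizing over $\beta$ subject to (1.4.1) is the same as minimizing $S(\beta)$ subject to (1.4.1), which gives $\bar\beta$; maximizing over $\sigma^2$ at $\beta = \bar\beta$ then gives $\sigma^2 = T^{-1}S(\bar\beta)$, which is (1.4.6).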
