# Closing Remarks

We conclude by summarizing the main lessons of this essay. First, we have tools, the Belsley, Kuh, and Welsch (1980) collinearity diagnostics, which allow us to determine the form and severity of collinearity in the linear regression model. Most importantly, they tell us which variables are involved in collinear relationships and which are not. If the least squares estimator is severely affected by collinearity, but the model's variables of interest are not involved in the collinear relationships, then there is no call for remedial action. Reaching such a conclusion requires us to think clearly about our models and to pinpoint the key variables.
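The diagnostics referred to above can be sketched in a few lines. The following is a minimal illustration, not the authors' code: the columns of X (including the intercept) are scaled to unit length, and the singular value decomposition yields the condition indexes and variance-decomposition proportions of Belsley, Kuh, and Welsch (1980). The two nearly collinear regressors in the example are invented for illustration.

```python
import numpy as np

def bkw_diagnostics(X):
    """Belsley-Kuh-Welsch collinearity diagnostics: condition indexes and
    variance-decomposition proportions. Columns of X (including the
    intercept column) are scaled to unit length but not centered."""
    Xs = X / np.linalg.norm(X, axis=0)           # unit column lengths
    _, mu, Vt = np.linalg.svd(Xs, full_matrices=False)
    condition_indexes = mu.max() / mu            # one per singular value
    # phi[k, j]: component of var(b_k) tied to singular value j
    phi = (Vt.T / mu) ** 2
    proportions = phi / phi.sum(axis=1, keepdims=True)
    return condition_indexes, proportions

# Illustrative data: an intercept plus two nearly collinear regressors
rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
X = np.column_stack([np.ones(50), x1, x1 + 1e-3 * rng.normal(size=50)])
idx, props = bkw_diagnostics(X)
```

A large condition index (Belsley, Kuh, and Welsch suggest 30 as a rough threshold) combined with two or more variables having high variance proportions on that index identifies which variables are involved in the collinear relationship, and, just as usefully, which are not.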

Since new and better data are rarely available, the only practical approach to mitigating harmful collinearity is the introduction of nonsample information about the parameters, based on prior empirical research or economic theory. However the information is introduced, whether via restricted least squares, the Bayesian approach, or maximum entropy estimation, we must endeavor to introduce "good" nonsample information. The difficulty is that we never truly know whether the information we introduce is good enough to reduce the estimator's mean squared error.
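Restricted least squares, the simplest of the approaches just mentioned, can be sketched as follows. This is a generic illustration under an invented restriction, not an example from the essay: the restrictions R @ beta = r are imposed on the OLS estimator using the standard formula.

```python
import numpy as np

def restricted_ls(X, y, R, r):
    """Restricted least squares: impose the linear restrictions
    R @ beta = r on the OLS estimator b. Good restrictions reduce
    variance; false ones introduce bias -- the mean-squared-error
    tradeoff discussed in the text."""
    XtX_inv = np.linalg.inv(X.T @ X)
    b = XtX_inv @ X.T @ y                        # unrestricted OLS
    A = XtX_inv @ R.T @ np.linalg.inv(R @ XtX_inv @ R.T)
    return b + A @ (r - R @ b)

# Hypothetical example: two collinear regressors whose coefficients
# are restricted (correctly, here) to sum to one
rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + 0.01 * rng.normal(size=100)
X = np.column_stack([x1, x2])
y = 0.4 * x1 + 0.6 * x2 + rng.normal(size=100)
R, r = np.array([[1.0, 1.0]]), np.array([1.0])
b_r = restricted_ls(X, y, R, r)
```

The restricted estimator satisfies the restriction exactly by construction; whether it beats OLS on mean squared error depends on whether the restriction is close to the truth, which, as noted above, we can never verify with certainty.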

The analysis of collinearity in nonlinear models is difficult. Collinearity (ill-conditioning) in asymptotic covariance matrices may arise from collinearity in the matrix of explanatory variables X, and/or from particular parameter values and function values. Identifying the cause of the ill-conditioning may or may not be possible, but again the use of good nonsample information seems the only remedy. In nonlinear models the problem of collinearity spills over into the estimation process, because the iterative algorithms used for numerical optimization may be sensitive to it. When this occurs, consider alternative algorithms: how we find the maximum or minimum of the objective function is unimportant, since estimator properties depend only upon the successful location of the global optimum.
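The sensitivity of iterative algorithms to ill-conditioning can be seen in a stylized example. The quadratic objective below is invented for illustration (its Hessian has a condition number of one million, a stand-in for a log-likelihood that is near-singular under collinearity): plain gradient descent stalls along the flat direction, while a Newton step locates the minimum at once. The estimator is the same either way; only the algorithm's ability to reach it differs.

```python
import numpy as np

# Stylized ill-conditioned objective f(b) = 0.5 * b' H b - g' b,
# minimized at b* = (1, 1). Condition number of H is 1e6.
H = np.diag([1.0, 1e-6])
g = np.array([1.0, 1e-6])

def gradient_descent(b, steps, lr):
    """First-order method: updates follow the negative gradient H b - g.
    Progress along the small-eigenvalue direction is extremely slow."""
    for _ in range(steps):
        b = b - lr * (H @ b - g)
    return b

b_newton = np.linalg.solve(H, g)   # Newton step: exact for a quadratic
b_gd = gradient_descent(np.zeros(2), steps=1000, lr=1.0)
# b_newton recovers (1, 1); b_gd is still far from 1 in coordinate 2
```

After a thousand iterations the gradient method has moved only about 0.1 percent of the way along the ill-conditioned direction, while the Newton step is exact. This is the sense in which the choice of algorithm matters for reaching the optimum, even though it is irrelevant to the properties of the estimator once the optimum is found.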

Note

* The authors wish to thank three anonymous referees for their helpful comments. All remaining errors are the authors’ own.

References

Almon, S. (1965). The distributed lag between capital appropriations and expenditures. Econometrica 33, 178-96.

Belsley, D. A. (1982). Assessing the presence of harmful collinearity and other forms of weak data through a test for signal-to-noise. Journal of Econometrics 20, 211-53.

Belsley, D. A. (1984). Demeaning conditioning diagnostics through centering. American Statistician 38, 73-93.

Belsley, D. A. (1991). Conditioning Diagnostics: Collinearity and Weak Data in Regression. New York: Wiley.

Belsley, D. A., E. Kuh, and R. E. Welsch (1980). Regression Diagnostics: Identifying Influential Data and Sources of Collinearity. New York: Wiley.

Blanchard, O. J. (1987). Comment. Journal of Business and Economic Statistics 5, 449-51.

Buck, A. J., and S. Hakim (1981). Appropriate roles for statistical decision theory and hypothesis testing in model selection: An exposition. Regional Science and Urban Economics 11, 135-47.

Buse, A. (1994). Brickmaking and the collinear arts: a cautionary tale. Canadian Journal of Economics 27, 408-14.

Chatterjee, S. and A. S. Hadi (1988). Sensitivity Analysis in Linear Regression. New York: Wiley.

Cook, R. D. and S. Weisberg (1982). Residuals and Influence in Regression. London: Chapman & Hall.

Davidson, R. and J. G. MacKinnon (1993). Estimation and Inference in Econometrics. New York: Oxford University Press.

Dorfman, J. H. (1997). Bayesian Economics through Numerical Methods: A Guide to Econometrics and Decision-Making with Prior Information. New York: Springer.

Fomby, T. B., R. C. Hill, and S. R. Johnson (1978). An optimality property of principal components regression. Journal of the American Statistical Association 73, 191-3.

Fomby, T. B., R. C. Hill, and S. R. Johnson (1984). Advanced Econometric Methods. New York: Springer-Verlag.

Fox, J. and G. Monette (1992). Generalized collinearity diagnostics. Journal of the American Statistical Association 87, 178-83.

Golan, A., G. G. Judge, and D. Miller (1996). Maximum Entropy Econometrics: Robust Estimation with Limited Data. New York: John Wiley and Sons.

Greene, W. (1997). Econometric Analysis, 3rd edn. Upper Saddle River, NJ: Prentice Hall.

Hadi, A. S. and M. T. Wells (1990). Assessing the effects of multiple rows on the condition of a matrix. Journal of the American Statistical Association 85, 786-92.

Hoerl, A. E., R. W. Kennard, and K. F. Baldwin (1975). Ridge regression: Some simulations. Communications in Statistics A 4, 105-23.

Judge, G. G. and M. E. Bock (1978). Statistical Implications of Pretest and Stein-Rule Estimators in Econometrics. Amsterdam: North-Holland.

Judge, G. G. and M. E. Bock (1983). Biased estimation. In Z. Griliches and M. D. Intriligator (eds.), Handbook of Econometrics, Volume 1. Amsterdam: North-Holland.

Judge, G. G., W. E. Griffiths, R. C. Hill, H. Lutkepohl, and T. C. Lee (1985). The Theory and Practice of Econometrics, 2nd edn. New York: John Wiley and Sons, Inc.

Judge, G. G., R. C. Hill, W. E. Griffiths, H. Lutkepohl, and T. C. Lee (1988). Introduction to the Theory and Practice of Econometrics, 2nd edn. New York: John Wiley and Sons, Inc.

Judge, G. G. and T. A. Yancey (1986). Improved Methods of Inference in Econometrics. Amsterdam: North-Holland.

Kennedy, P. (1982). Eliminating problems caused by multicollinearity: A warning. Journal of Economic Education 13, 62-4.

Kennedy, P. (1983). On an inappropriate means of reducing multicollinearity. Regional Science and Urban Economics 13, 579-81.

Kennedy, P. (1998). A Guide to Econometrics, 4th edn. Cambridge: MIT Press.

Lawless, J. F. and P. Wang (1976). A simulation study of ridge and other regression estimators. Communications in Statistics A 5, 307-23.

Leamer, E. E. (1978). Specification Searches: Ad Hoc Inference with Nonexperimental Data. New York: Wiley.

Lee, K. Y. and L. A. Weissfeld (1996). A multicollinearity diagnostic for the Cox model with time dependent covariates. Communications in Statistics – Simulation 25, 41-60.

Lesaffre, E. and B. D. Marx (1993). Collinearity in generalized linear regression. Communications in Statistics – Theory and Methods 22, 1933-52.

Mackinnon, M. J. and M. L. Puterman (1989). Collinearity in generalized linear models. Communications in Statistics – Theory and Methods 18, 3463-72.

McCullagh, P. and J. A. Nelder (1989). Generalized Linear Models, 2nd edn. London: Chapman and Hall.

Mason, R. L. and R. F. Gunst (1985). Outlier-induced collinearities. Technometrics 27, 401-7.

Segerstedt, B. and H. Nyquist (1992). On the conditioning problem in generalized linear models. Journal of Applied Statistics 19, 513-22.

Sengupta, D. (1995). Optimal choice of a new observation in a linear model. Sankhya: The Indian Journal of Statistics Series A 57, 137-53.

Sengupta, D. and P. Bhimasankaram (1997). On the roles of observations in collinearity in the linear model. Journal of the American Statistical Association 92, 1024-32.

Silvey, S. (1969). Multicollinearity and imprecise estimation. Journal of the Royal Statistical Society B 31, 539-52.

Soofi, E. S. (1990). Effects of collinearity on information about regression coefficients. Journal of Econometrics 43, 255-74.

Stewart, G. W. (1987). Collinearity and least squares regression. Statistical Science 1, 68-100.

Theil, H. and A. Goldberger (1961). On pure and mixed statistical estimation in economics. International Economic Review 2, 65-78.

Toro-Vizcarrondo, C. and T. Wallace (1968). A test of the mean square error criterion for restrictions in linear regression. Journal of the American Statistical Association 63, 558-76.

Weissfeld, L. A. and S. M. Sereika (1991). A multicollinearity diagnostic for generalized linear models. Communications in Statistics A 20, 1183-98.

Zellner, A. (1971). An Introduction to Bayesian Inference in Econometrics. New York: John Wiley and Sons.
