# Collinearity

R. Carter Hill and Lee C. Adkins

> Multicollinearity is God’s will, not a problem with OLS or statistical techniques in general.
>
> Blanchard (1987, p. 49)

Collinearity, a devilish problem to be sure, receives the blame for a substantial amount of inconclusive, weak, or unbelievable empirical work. Social scientists are, for the most part, nonexperimental scientists. We do not have the luxury of designing and carrying out the experiments that generate our data. Consequently, our data are often weak and not up to the task of isolating the effect of changes in one economic variable upon another. In regression models the least squares estimates may have the wrong sign, be sensitive to slight changes in the data or the model specification, or may not yield statistically significant results for theoretically important explanatory variables. These symptoms may appear despite significant values for the overall F-test of model significance or high R² values. These are commonly cited consequences of a "collinearity problem."
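These symptoms are easy to reproduce on hypothetical simulated data. The sketch below (our own illustration, not from the chapter; all numbers are arbitrary) generates two nearly identical regressors and fits OLS with NumPy: the regression fits well overall, yet the individual slope estimates carry very large standard errors and small t-statistics.

```python
import numpy as np

# Hypothetical data: x2 is nearly identical to x1 (an illustrative choice)
rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + 0.01 * rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

# OLS via least squares
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Standard errors: sqrt of the diagonal of s^2 (X'X)^{-1}
resid = y - X @ beta
s2 = resid @ resid / (n - 3)
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = beta / se

# Overall fit is strong even though individual slopes are imprecise
r2 = 1.0 - (resid @ resid) / np.sum((y - y.mean()) ** 2)
print(f"R^2 = {r2:.3f}")
print("slope standard errors:", np.round(se[1:], 2))
print("slope t-statistics:", np.round(t_stats[1:], 2))
```

The overall R² is high because x1 and x2 jointly explain y well; the huge slope standard errors arise because the data contain almost no independent variation with which to apportion that joint effect between the two variables.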

In the context of the linear regression model, collinearity takes three distinct forms. First, an explanatory variable may exhibit little variability. Intuitively, this is a problem for regression because we are trying to estimate the effect of changes in the explanatory variable upon the dependent variable. If an explanatory variable does not vary much in our sample, it will be difficult to estimate its effect. Second, two explanatory variables may exhibit a large correlation. In this case, the attempt to isolate the effect of one variable, all other things held constant, is made difficult by the fact that in the sample the variable exhibits little independent variation. The correlation between two explanatory variables implies that changes in one are linked to changes in the other, and thus separating out their individual effects may be difficult. Third, and more generally, there may be one or more nearly exact linear relationships among the explanatory variables. As in the case when two explanatory variables are correlated, such relationships obscure the effects of the involved variables upon the dependent variable. These are the three faces of collinearity.
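The second and third faces can be quantified directly. A common summary (our illustration here, using made-up data; the 0.95 coefficient is arbitrary) is the variance inflation factor, VIF = 1/(1 − R²), where R² comes from regressing one explanatory variable on the others. With two regressors this reduces to 1/(1 − r²) for their sample correlation r:

```python
import numpy as np

# Hypothetical sample in which x2 is a near-exact linear function of x1
rng = np.random.default_rng(42)
n = 100
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)

# With two regressors, the VIF for each is 1 / (1 - r^2),
# where r is their sample correlation
r = np.corrcoef(x1, x2)[0, 1]
vif = 1.0 / (1.0 - r**2)
print(f"corr(x1, x2) = {r:.3f}, VIF = {vif:.1f}")
```

A VIF of 1 means a regressor is uncorrelated with the others; values in the tens or hundreds signal that its coefficient's sampling variance is inflated by that factor relative to the orthogonal-regressor case.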

In this chapter we explore collinearity in linear and nonlinear models. In Section 2 we present the basics, examining the forms that collinearity may take and the damage it does to estimation. The variance decomposition of Belsley, Kuh, and Welsch (1980) (hereinafter BKW) is presented in Section 3, and other collinearity diagnostics and issues are considered in Section 4. Section 5 reviews suggested remedies for collinearity problems. In Section 6 we examine the problems of collinearity in nonlinear models. Summary remarks are contained in Section 7.
