
Collinearity between two INDEPENDENT VARIABLES, or multicollinearity between multiple independent variables, in LINEAR REGRESSION analysis means that there are linear relations between these variables. In a vector model, in which variables are represented as vectors, exact collinearity would mean that two vectors lie on the same line or, more generally, for k variables, that the vectors lie in a subspace of dimension less than k (i.e., that one of the vectors is a linear combination of the others). The result of such an exact linear relation among the variables in X would be that the cross-product matrix X′X is singular (i.e., its determinant equals zero), so that it has no inverse. In practice, an exact linear relationship rarely occurs, but the interdependence of social phenomena may result in “approximate” linear relationships. This phenomenon is known as multicollinearity.
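To make the singularity concrete, the following minimal sketch (not part of the original entry; the variable names and values are illustrative assumptions) builds a design matrix whose second column is an exact multiple of the first and checks that X′X has a zero determinant and no inverse, so the OLS estimate (X′X)⁻¹X′y cannot be computed.

```python
# Illustrative sketch (assumed data): exact collinearity makes X'X singular.
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = 2.0 * x1                      # exact linear relation: x2 is a multiple of x1
X = np.column_stack([x1, x2])

XtX = X.T @ X
print(np.linalg.det(XtX))          # ~0 (up to floating-point error): singular
try:
    np.linalg.inv(XtX)
except np.linalg.LinAlgError:
    print("X'X is singular; no inverse exists")
```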

In the analysis of interdependence, such as PRINCIPAL COMPONENTS and FACTOR ANALYSIS, multicollinearity is a blessing because dimension reduction is the objective, whereas in the analysis of dependence structures, such as MULTIPLE REGRESSION analysis, multicollinearity among the independent variables is a problem. We will treat it here in the context of regression analysis.

Consequences

Imagine an investigation into the causes of the well-being of unemployed workers (measured on a scale from 0 to 100). A great number of explanatory variables will be taken into account, such as annual income (the lower the annual income, the lower the well-being), duration of unemployment (the greater the number of weeks of unemployment, the lower the well-being), and others. Now, some of these independent variables will be so highly correlated that conclusions from a multiple regression analysis will be affected. High multicollinearity is undesirable in a multicausal model because standard errors become much larger and CONFIDENCE INTERVALS much wider, thus affecting SIGNIFICANCE TESTING and STATISTICAL INFERENCE, even though the ORDINARY LEAST SQUARES (OLS) estimator is still unbiased. Like variances and covariances, multicollinearity is a sample property, which means that analyses may be unreliable and sensitive to a few outliers or extreme cases.
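The short simulation below is a hedged sketch of this consequence, not part of the original entry: the sample size, true coefficients, and correlation levels are arbitrary assumptions. It fits the same two-predictor OLS model with nearly uncorrelated and with highly correlated predictors and compares the resulting standard errors of the first coefficient, which grow sharply as the correlation between X1 and X2 rises even though the estimator remains unbiased.

```python
# Hedged simulation sketch (assumed n, beta, sigma, and correlation levels).
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 200, 1.0
beta = np.array([0.5, 0.5])                     # assumed true coefficients

def mean_se_b1(r12, reps=500):
    """Average estimated standard error of b1 over repeated samples."""
    ses = []
    for _ in range(reps):
        x1 = rng.standard_normal(n)
        # build x2 with population correlation r12 to x1
        x2 = r12 * x1 + np.sqrt(1 - r12**2) * rng.standard_normal(n)
        X = np.column_stack([x1, x2])
        y = X @ beta + sigma * rng.standard_normal(n)
        b, *_ = np.linalg.lstsq(X, y, rcond=None)
        s2 = np.sum((y - X @ b) ** 2) / (n - 2)   # residual variance estimate
        ses.append(np.sqrt(s2 * np.linalg.inv(X.T @ X)[0, 0]))
    return np.mean(ses)

print(mean_se_b1(0.0))    # baseline standard error of b1
print(mean_se_b1(0.95))   # much larger standard error under high collinearity
```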

The problem of correlated independent variables in a causal model is a long-standing sore point in social scientific research. In multiple regression analysis, it is referred to as multicollinearity (the predictor vectors nearly lie on a single line), whereas in ANALYSIS OF VARIANCE and COVARIANCE, one tends to use the term nonorthogonality (the factors of the factor-response model are not perpendicular to each other). Multicollinearity and nonorthogonality are, in fact, synonyms.

Let us emphasize that strange things may happen in an analysis with high multicollinearity. Kendall and Stuart (1948) provided an extreme example with one dependent variable Y and two independent variables X1 and X2, in which the correlations of Y with X1 and with X2 are weak (one of them even zero) and the correlation between X1 and X2 is very strong (0.90, high multicollinearity), whereas the multiple correlation coefficient R_Y·12 is approximately 1. This example shows a classic consequence of multicollinearity: insignificant results when testing the individual parameters combined with a high R-SQUARED value. The standardized coefficients (a direct function of sample variances), which with uncorrelated predictors cannot exceed 1 in absolute value because their squared values then express proportions of explained variance, here exceed 2; consequently, each variable appears to explain more than 400% of the variance, when there is only 100% to be explained!
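The entry does not report the exact correlations, so the sketch below uses assumed values chosen only to match its description (r_Y,X1 = 0, r_Y,X2 weak, r_X1,X2 = 0.90). It solves the normal equations in correlation form, R_xx β* = r_xy, and computes R² = r_xy′ β*, reproducing the pattern described: both standardized coefficients exceed 2 in absolute value while R² is close to 1.

```python
# Hedged worked illustration of a Kendall-and-Stuart-type example.
# The correlations below are assumptions, not the published values.
import numpy as np

r12 = 0.90                      # high collinearity between X1 and X2
r_y1, r_y2 = 0.00, 0.43         # weak correlations with Y (one exactly zero)

R_xx = np.array([[1.0, r12], [r12, 1.0]])   # predictor correlation matrix
r_xy = np.array([r_y1, r_y2])               # correlations of predictors with Y

beta_std = np.linalg.solve(R_xx, r_xy)      # standardized coefficients
r_squared = r_xy @ beta_std

print(beta_std)     # roughly [-2.04, 2.26]: both exceed 2 in absolute value
print(r_squared)    # roughly 0.97: close to 1 despite the weak correlations
```

With these assumed values the squared standardized coefficients are above 4, which is the "more than 400% explained" paradox described above.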

...
