
In multiple regression, a single dependent variable (y, the criterion) is predicted from a number of independent variables. If there were only a single independent variable (x1), the regression coefficient b would simply be a rescaling of the correlation coefficient rxy (i.e., b = rxy(sy/sx), which reduces to b = rxy when both variables are standardized). Therefore, rxy can be used to gauge the importance of the single predictor variable. With multiple predictor variables, the correlations among the predictors (i.e., multicollinearity), as well as their correlations with the criterion, influence the regression weights. If one is interested only in optimizing the overall fit of the model (i.e., R2), the regression coefficients require little interpretation; however, understanding the effect of each individual variable becomes very difficult. It can also happen that a variable significantly correlated with the criterion does not enter the model because another variable is more strongly associated with the criterion and the additional variance explained is insufficient to achieve statistical significance.
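The single-predictor relation can be checked directly. The following is a minimal sketch (NumPy, with simulated data; the variable names and sample size are illustrative assumptions, not part of the original entry) verifying that the least-squares slope equals rxy rescaled by the ratio of standard deviations:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = 2.0 * x + rng.normal(size=200)

r_xy = np.corrcoef(x, y)[0, 1]        # correlation of x and y
b = np.polyfit(x, y, 1)[0]            # least-squares slope of y on x
print(np.isclose(b, r_xy * y.std() / x.std()))   # True: b = r_xy * (s_y / s_x)
```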

One approach to this problem is to conduct a principal component analysis. The resulting dimensions are all uncorrelated but, unfortunately, hard to interpret with respect to the original variables. Alternatively, the predictor variables can be orthogonalized by means of a transformation so that all transformed predictors are uncorrelated with each other, while each transformed predictor remains maximally correlated with the corresponding original variable. If the correlations of the transformed variables with the original predictor variables are high, the results of the orthogonal regression analysis are easy to interpret. The transformation yields variables that are pure in the sense that no variable shares any common variance with any other in the explanatory set. The regression model with the orthogonal predictors is insensitive to the order in which the independent variables are entered into the equation, and the explanatory power of each predictor is directly interpretable as its squared correlation with the criterion.

The transformation of the original variables is obtained from the matrix equation Z* = ZR^(−1/2), where Z is the matrix of original variables in z-score form (mean 0, standard deviation 1) and R^(−1/2) is the inverse of the symmetric square root of the correlation matrix R among the original variables. The correlation matrix R* for the orthogonalized predictors in the columns of Z* is then (Z*)'Z*/N = I, the identity matrix (i.e., all pairwise correlations are 0, indicating orthogonality). The correlation matrix of the orthogonally transformed predictor variables with the original predictor variables is R^(1/2), the square root of the correlation matrix R among the original predictor variables.
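As a concrete illustration, the sketch below (NumPy, with simulated correlated predictors; the population correlation matrix and sample size are assumptions for the demonstration) computes R^(−1/2) through an eigendecomposition of R, forms Z* = ZR^(−1/2), and checks that the columns of Z* are uncorrelated while their correlations with the original variables equal R^(1/2):

```python
import numpy as np

rng = np.random.default_rng(1)
R_true = np.array([[1.0, 0.5, 0.3],
                   [0.5, 1.0, 0.4],
                   [0.3, 0.4, 1.0]])
Z = rng.multivariate_normal(np.zeros(3), R_true, size=500)
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)     # columns as z scores
N = Z.shape[0]
R = Z.T @ Z / N                              # observed correlation matrix

vals, vecs = np.linalg.eigh(R)               # R = V diag(vals) V'
R_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T   # R^(-1/2)
R_sqrt = vecs @ np.diag(vals ** 0.5) @ vecs.T        # R^(1/2)
Z_star = Z @ R_inv_sqrt                      # orthogonalized predictors

print(np.allclose(Z_star.T @ Z_star / N, np.eye(3)))  # identity: orthogonality
print(np.allclose(Z.T @ Z_star / N, R_sqrt))          # correlations with originals
```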

The amount of predictable variance accounted for by each predictor variable in orthogonal regression is the squared correlation between the orthogonalized predictor and the criterion, and the overall fit of the model (i.e., R2) is simply the sum of these individual r2 values across the variables in the model.
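A minimal sketch (again NumPy with simulated data; the coefficients and sample size are arbitrary assumptions) confirming that, with orthogonalized predictors and a standardized criterion, R2 equals the sum of the individual squared criterion correlations:

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.multivariate_normal(np.zeros(2), [[1.0, 0.6], [0.6, 1.0]], size=400)
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)     # columns as z scores
N = Z.shape[0]
R = Z.T @ Z / N

vals, vecs = np.linalg.eigh(R)
Z_star = Z @ vecs @ np.diag(vals ** -0.5) @ vecs.T   # Z* = Z R^(-1/2)

y = Z @ np.array([0.7, 0.2]) + rng.normal(scale=0.6, size=N)
y = (y - y.mean()) / y.std()                 # standardized criterion

r = Z_star.T @ y / N                         # criterion correlations of Z* columns
beta, *_ = np.linalg.lstsq(Z_star, y, rcond=None)
resid = y - Z_star @ beta
R2 = 1.0 - (resid ** 2).sum() / N            # total SS = N for standardized y
print(np.allclose(R2, (r ** 2).sum()))       # True: R2 = sum of squared r's
```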

John R. Reddon and James S. Ho

Further Reading

Cawsey, T. F., Reed, P. L., & Reddon, J. R. (1982). Human needs and job satisfaction: A multidimensional approach. Human Relations, 35, 703–715. http://dx.doi.org/10.1177/001872678203500901