Multicollinearity and the Statistical Power of Regression Analysis

by Joseph P. Newhouse


A note on the effect of multicollinearity on the power of regression analysis. (1) In general, collinearity lowers the power of statistical tests of significance in regression analysis. However, if one of two collinear variables is incorrectly dropped, the null hypothesis of no relationship is frequently less likely to be accepted than if the variables had been orthogonal. (2) If only one of two potential explanatory variables actually belongs in the model, the correct one will have: (a) the higher expected t-statistic if two simple regressions are run, each with one of the two variables as an explanatory variable; (b) the higher expected t-statistic if both variables are included. Increasing the degree of collinearity can, however, make the expected values of the t-statistics arbitrarily close. In such a case, sampling error can be a major determinant of which variable has the higher t-statistic.
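Point (2b) can be illustrated with a small Monte Carlo sketch. The following is not from the paper itself; it assumes a simple data-generating process (a true coefficient on x1 of 0.5, a spurious regressor x2 correlated with x1 at rho = 0.9, standard normal noise) and checks how often the correct variable ends up with the larger absolute t-statistic when both are included in the regression.

```python
import numpy as np

rng = np.random.default_rng(0)

def t_stats(X, y):
    """OLS t-statistics for each column of X (no intercept, for simplicity)."""
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    df = len(y) - X.shape[1]
    sigma2 = resid @ resid / df            # residual variance estimate
    se = np.sqrt(sigma2 * np.diag(XtX_inv))  # standard errors of coefficients
    return beta / se

n, reps, rho = 100, 500, 0.9
wins = 0  # times the true regressor x1 gets the larger |t|
for _ in range(reps):
    x1 = rng.standard_normal(n)
    # x2 is collinear with x1 (correlation rho) but does not belong in the model
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    y = 0.5 * x1 + rng.standard_normal(n)
    t = t_stats(np.column_stack([x1, x2]), y)
    wins += abs(t[0]) > abs(t[1])

print(f"x1 had the larger |t| in {wins / reps:.0%} of {reps} draws")
```

With rho at 0.9 the correct variable wins well over half the time, but pushing rho toward 1 narrows the gap between the expected t-statistics, so sampling error increasingly decides which variable looks significant, consistent with the note's conclusion.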

This report is part of the RAND Corporation paper series. The paper series was a product of the RAND Corporation from 1948 to 2003 that captured speeches, memorials, and derivative research, usually prepared on authors' own time and meant to be the scholarly or scientific contribution of individual authors to their professional fields. Papers were less formal than reports and did not require rigorous peer review.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.