A note on the effect of multicollinearity on the power of regression analysis. (1) In general, collinearity lowers the power of statistical tests of significance in regression analysis. However, if one of two collinear variables is incorrectly dropped, the null hypothesis of no relationship is often less likely to be accepted than if the variables had been orthogonal. (2) If only one of two potential explanatory variables actually belongs in the model, the correct one will have: (a) the higher expected t-statistic if two simple regressions are run, each with one of the two variables as the explanatory variable; (b) the higher expected t-statistic if both variables are included. Increasing the degree of collinearity can, however, make the expected values of the t-statistics arbitrarily close; in that case, sampling error can be a major determinant of which variable has the higher t-statistic. 7 pp. (AR)
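Point (2) can be illustrated with a small Monte Carlo sketch (an assumption for illustration, not part of the original paper): y depends only on x1, x2 is constructed to have correlation rho with x1, and simple regressions of y on each variable are compared. The function names (`t_stat`, `simulate`) and the parameter choices (n = 50, slope 1, unit noise) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def t_stat(x, y):
    """t-statistic for the slope in a simple regression of y on x (with intercept)."""
    n = len(y)
    x_c = x - x.mean()
    y_c = y - y.mean()
    beta = (x_c @ y_c) / (x_c @ x_c)          # OLS slope
    resid = y_c - beta * x_c
    se = np.sqrt((resid @ resid) / (n - 2) / (x_c @ x_c))
    return beta / se

def simulate(rho, n=50, reps=2000):
    """Mean t-statistics when only x1 truly drives y and x2 is collinear with x1."""
    t1s, t2s = [], []
    for _ in range(reps):
        x1 = rng.standard_normal(n)
        # x2 has population correlation rho with x1 but does not belong in the model
        x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
        y = 1.0 * x1 + rng.standard_normal(n)
        t1s.append(t_stat(x1, y))
        t2s.append(t_stat(x2, y))
    return np.mean(t1s), np.mean(t2s)

for rho in (0.3, 0.9, 0.99):
    t1, t2 = simulate(rho)
    print(f"rho={rho:.2f}: mean t(x1)={t1:.2f}, mean t(x2)={t2:.2f}")
```

In this setup the correct variable x1 has the larger mean t-statistic at every rho, but as rho approaches 1 the two means draw together, so in any single sample the wrong variable can easily win on the t-statistic alone.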
This report is part of the RAND Corporation Paper series. The Paper series was a product of the RAND Corporation from 1948 to 2003 that captured speeches, memorials, and derivative research, usually prepared on authors' own time and meant to be the scholarly or scientific contribution of individual authors to their professional fields. Papers were less formal than reports and did not require rigorous peer review.
This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited; linking directly to this product page is encouraged. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial purposes. For information on reprint and reuse permissions, please visit www.rand.org/pubs/permissions.
The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.