10.3 Multicollinearity

Estimating this equation results in a software error message. This makes sense, because we cannot change any one of these explanatory variables while holding the other two constant. In this situation, where explanatory variables are related by their very definition, the solution is simple: omit one of the redundant variables. In our example, we can simplify the model to

C = α + β₁Y + β₂Y(−1) + ε

The situation is more difficult when the explanatory variables are not related by definition but happen to be correlated in our data. This kind of multicollinearity problem can be diagnosed by regressing each explanatory variable on the other explanatory variables to see whether the R² values reveal high intercorrelations among the explanatory variables. While this diagnosis can explain the model's high standard errors and low t values, there is no statistical cure. A builder cannot make solid bricks without good clay, and a statistician cannot make precise estimates without informative data. The solution to the multicollinearity problem is additional data that are not so highly intercorrelated.

There is no firm rule for deciding when the correlations among the explanatory variables are large enough to constitute a "problem." Multicollinearity does not bias any of the estimates; it is more an explanation for why the estimates are not more precise. In our consumption function example, the estimated equation seems fine (the t values are in brackets):
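The auxiliary-regression diagnosis described above is easy to carry out mechanically. The following Python sketch (not from the text) illustrates the idea with hypothetical income data; the function name auxiliary_r2 and the simulated Y and Y(−1) series are our own constructions. Each explanatory variable is regressed on the others, and the resulting R² is converted to the variance inflation factor, 1/(1 − R²), a common way of expressing how much multicollinearity inflates a coefficient's variance.

```python
import numpy as np

def auxiliary_r2(X):
    """For each column of X, regress it on the remaining columns
    (plus an intercept) and return the R-squared values. A value
    near 1 signals that the column is nearly a linear combination
    of the others, i.e. severe multicollinearity."""
    n, k = X.shape
    r2 = np.empty(k)
    for j in range(k):
        y = X[:, j]
        others = np.delete(X, j, axis=1)
        Z = np.column_stack([np.ones(n), others])   # add intercept
        beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
        resid = y - Z @ beta
        ss_res = resid @ resid
        ss_tot = ((y - y.mean()) ** 2).sum()
        r2[j] = 1.0 - ss_res / ss_tot
    return r2

# Hypothetical data: current income Y and lagged income Y(-1),
# which tend to be highly correlated in trending time-series data.
rng = np.random.default_rng(0)
Y = np.cumsum(rng.normal(2.0, 1.0, 50)) + 100   # trending income series
Y_lag = np.concatenate([[Y[0]], Y[:-1]])        # Y(-1)
X = np.column_stack([Y, Y_lag])

for name, r2 in zip(["Y", "Y(-1)"], auxiliary_r2(X)):
    vif = 1.0 / (1.0 - r2)   # variance inflation factor
    print(f"{name}: auxiliary R^2 = {r2:.3f}, VIF = {vif:.1f}")
```

With data like these, the auxiliary R² values come out close to 1 and the VIFs are large, which is exactly the pattern that explains high standard errors and low t values; in the limiting case of an exact linear relationship among the variables, the regression cannot be computed at all, which is the software error discussed above.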