Multicollinearity occurs when two or more independent variables in a regression model are highly correlated, leading to inflated standard errors of Ordinary Least Squares (OLS) parameter estimates. This can result in independent variables appearing statistically insignificant despite their critical role in explaining the dependent variable. Symptoms include a high R-squared value with few significant t-tests. Detection methods include variance inflation factors (VIF), with values exceeding 10 indicating significant multicollinearity. Remedies may involve excluding problematic variables or modifying the model specification.
AAEC 4302 ADVANCED STATISTICAL METHODS IN AGRICULTURAL RESEARCH. Chapter 13.3: Multicollinearity
Multicollinearity • Multicollinearity occurs when two or more independent variables in a regression model are highly correlated with each other • The standard error of an OLS parameter estimate is higher the more highly the corresponding independent variable is correlated with the other independent variables in the model
Multicollinearity • As a result, independent variables may show no statistical significance on the basic significance test even when they belong in the model • This is not a mistake in the model specification, but a consequence of the nature of the data at hand
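A minimal simulation sketch of this effect; the data-generating process and variable names are assumptions for illustration, not anything from the chapter. It shows the standard error of the estimate on X1 growing sharply once X1 is highly correlated with X2:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 100
x1 = rng.normal(size=n)

for rho in (0.0, 0.95):
    # build x2 with (approximately) correlation rho to x1
    x2 = rho * x1 + np.sqrt(1 - rho ** 2) * rng.normal(size=n)
    y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)
    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()
    # SE(b1) is several times larger in the highly correlated case
    print(f"rho = {rho:.2f}  SE(b1) = {fit.bse[1]:.3f}")
```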
Perfect Multicollinearity • Perfect multicollinearity occurs when there is an exact linear relationship between two or more independent variables • It also occurs when an independent variable takes a constant value in all observations, making it perfectly collinear with the model's intercept term
Severe Multicollinearity • Under perfect multicollinearity, the OLS method cannot produce parameter estimates • A certain degree of correlation (multicollinearity) between the independent variables is normal and expected in most cases • Severe multicollinearity, a high but not perfect degree of correlation, is what creates estimation problems in practice
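A small numeric sketch (data assumed for illustration) of why estimation fails under perfect multicollinearity: when X3 = X1 + X2 exactly, the design matrix is rank-deficient, so X'X has no usable inverse and the OLS normal equations have no unique solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = x1 + x2                                    # exact linear combination of x1 and x2
X = np.column_stack([np.ones(n), x1, x2, x3])   # constant plus three regressors

print("rank of X:", np.linalg.matrix_rank(X))   # 3, not 4: X is rank-deficient
print("cond(X'X):", np.linalg.cond(X.T @ X))    # enormous: X'X cannot be inverted reliably
```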
Symptoms of Multicollinearity • The symptoms of a multicollinearity problem: • independent variable(s) considered critical in explaining the model's dependent variable are not statistically significant according to the usual t-tests
Symptoms of Multicollinearity • High R2 and a highly significant F-test, but few or no statistically significant t-tests • Parameter estimates that change drastically in value, and become statistically significant, when some independent variables are excluded from the regression
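An illustration of this symptom pattern on simulated data (the data-generating process is an assumption, not from the chapter): the fitted model will typically show a high R2 and a highly significant F-test while neither individual t-test is significant:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 40
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)        # x2 nearly collinear with x1
y = 1.0 + 2.0 * x1 + 2.0 * x2 + rng.normal(size=n)

fit = sm.OLS(y, sm.add_constant(np.column_stack([x1, x2]))).fit()
# typically: R2 near 0.9+, tiny F p-value, yet large t-test p-values
print(f"R2 = {fit.rsquared:.3f}, F p-value = {fit.f_pvalue:.2g}")
print("t-test p-values for b1, b2:", np.round(fit.pvalues[1:], 3))
```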
Detecting Multicollinearity • A simple test for multicollinearity is to conduct "artificial" regressions between each independent variable (as the "dependent" variable) and the remaining independent variables • Variance Inflation Factors are calculated from those regressions as VIFj = 1/(1 - Rj2), where Rj2 is the R-squared of the artificial regression with Xj as the "dependent" variable
Detecting Multicollinearity • VIFj = 2, for example, means that the variance of the estimate of Bj is twice what it would be if Xj were not affected by multicollinearity • A VIFj > 10 is clear evidence that the estimation of Bj is being affected by multicollinearity
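A sketch of the VIF calculation on the same kind of assumed simulated data, showing both the artificial-regression formula VIFj = 1/(1 - Rj2) and the equivalent statsmodels helper:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
n = 200
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + 0.3 * rng.normal(size=n)   # x2 correlated with x1
x3 = rng.normal(size=n)
X = sm.add_constant(np.column_stack([x1, x2, x3]))

# "artificial" regression: X1 on the constant and the other independent variables
aux = sm.OLS(X[:, 1], np.delete(X, 1, axis=1)).fit()
print("VIF for X1 via 1/(1 - Rj2):", 1.0 / (1.0 - aux.rsquared))

# same number from the built-in helper (column 0 is the constant)
print("VIF for X1 via statsmodels:", variance_inflation_factor(X, 1))
```

Both lines should print the same value, since variance_inflation_factor performs exactly this auxiliary regression internally.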
Addressing Multicollinearity • Although it is useful to be aware of the presence of multicollinearity, it is not easy to remedy severe (non-perfect) multicollinearity • If possible, adding observations or taking a new sample might help lessen multicollinearity
Addressing Multicollinearity • Exclude the independent variables that appear to be causing the problem • Modifying the model specification sometimes helps, for example: • using real instead of nominal economic data • using a reciprocal instead of a polynomial specification for a given independent variable
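A brief sketch of the first remedy on assumed simulated data: excluding the nearly collinear variable X2 and checking that the remaining VIFs fall back toward 1:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
n = 200
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.2 * rng.normal(size=n)   # x2 nearly collinear with x1
x3 = rng.normal(size=n)

full = sm.add_constant(np.column_stack([x1, x2, x3]))
reduced = sm.add_constant(np.column_stack([x1, x3]))   # x2 excluded

# VIFs for the non-constant columns of each specification
for label, X in (("full model ", full), ("x2 excluded", reduced)):
    vifs = [variance_inflation_factor(X, j) for j in range(1, X.shape[1])]
    print(label, np.round(vifs, 2))
```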