
Assumptions of Regression Analysis


Presentation Transcript


  1. Assumptions of Regression Analysis • The independent variables do not form a linearly dependent set; i.e., the explanatory variables are not perfectly correlated. • Homoscedasticity: the probability distributions of the error term have a constant variance for all values of the independent variables (the Xᵢ's).

  2. Perfect multicollinearity is a violation of assumption (1). Heteroscedasticity is a violation of assumption (2).
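
Perfect collinearity is easy to see mechanically. A minimal sketch in Python with NumPy and invented data: when one regressor is an exact linear function of another, the normal-equations matrix X′X is rank-deficient, so OLS has no unique solution.

```python
import numpy as np

rng = np.random.default_rng(0)
x1 = rng.normal(size=50)
x2 = 3.0 * x1                          # x2 is an exact linear function of x1
X = np.column_stack([np.ones(50), x1, x2])

XtX = X.T @ X
print(np.linalg.matrix_rank(XtX))      # 2, not 3: X'X is rank-deficient
# np.linalg.solve(XtX, X.T @ y) would raise LinAlgError: Singular matrix
```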

  3. Multicollinearity is a problem with time series regression. Suppose we wanted to estimate the following specification using quarterly time series data: Auto Salesₜ = β₀ + β₁Incomeₜ + β₂Pricesₜ, where Incomeₜ is (nominal) income in quarter t and Pricesₜ is an index of auto prices in quarter t. The data reveal there is a strong (positive) correlation between nominal income and car prices.
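
The pattern is easy to reproduce. A sketch with simulated quarterly series (the trends and noise levels are invented; only the variable names follow the slide): two macro series that both trend upward are nearly collinear even though neither causes the other.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.arange(40)                               # 40 quarters
income = 100 + 2.0 * t + rng.normal(0, 3, 40)   # nominal income, trending up
prices = 50 + 1.0 * t + rng.normal(0, 2, 40)    # auto price index, trending up

print(np.corrcoef(income, prices)[0, 1])        # close to 1: near-collinear
```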

  4. [Figure: scatter plot showing an approximate linear relationship between the explanatory variables, car prices plotted against (nominal) income.]

  5. Why is multicollinearity a problem? • In the case of perfectly collinear explanatory variables, OLS does not work. • In the case where there is an approximate linear relationship among the explanatory variables (the Xᵢ's), the estimates of the coefficients are still unbiased, but you run into the following problems (see the sketch below): • High standard errors of the estimates of the coefficients, and thus low t-ratios. • Co-mingling of the effects of explanatory variables. • Estimates of the coefficients tend to be “unstable.”
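
A sketch of the standard-error problem with statsmodels on simulated data (the model and coefficients are invented): the same regression is fit twice, once with roughly independent regressors and once with x2 nearly equal to x1. The slope standard errors balloon in the near-collinear case.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 100
x1 = rng.normal(size=n)

for noise_sd in (1.0, 0.05):                # independent vs nearly collinear x2
    x2 = x1 + noise_sd * rng.normal(size=n)
    y = 1 + x1 + x2 + rng.normal(size=n)
    X = sm.add_constant(np.column_stack([x1, x2]))
    fit = sm.OLS(y, X).fit()
    print(noise_sd, fit.bse[1:])            # slope SEs grow sharply as x2 -> x1
```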

  6. What to do about multicollinearity • Increase sample size • Delete one or more explanatory variables
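
One common diagnostic for deciding whether (and what) to delete is the variance inflation factor (VIF). It is not on the slide, but a sketch with statsmodels on invented data follows; VIFs above roughly 10 are conventionally read as a multicollinearity warning.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(3)
x1 = rng.normal(size=100)
x2 = x1 + 0.1 * rng.normal(size=100)        # nearly collinear with x1
X = sm.add_constant(np.column_stack([x1, x2]))

for i in (1, 2):                             # column 0 is the constant
    print(f"VIF of x{i}: {variance_inflation_factor(X, i):.1f}")
```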

  7. Understanding heteroscedasticity. This problem pops up when using cross-sectional data.

  8. Consider the following model: Yᵢ = β₀ + β₁Xᵢ + εᵢ, where β₀ + β₁Xᵢ is the “determined” part of the equation and εᵢ is the error term. Remember, we assume in regression that E(εᵢ) = 0.
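
A minimal simulation of this model (the intercept 2.0 and slope 0.5 are invented): draw mean-zero errors and fit OLS with statsmodels. With an intercept included, the residuals average to zero, mirroring the E(εᵢ) = 0 assumption.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
x = rng.uniform(0, 10, 200)
e = rng.normal(0, 1, 200)                  # mean-zero, constant-variance errors
y = 2.0 + 0.5 * x + e                      # Y_i = b0 + b1*X_i + e_i

fit = sm.OLS(y, sm.add_constant(x)).fit()
print(fit.resid.mean())                    # ~0 (exactly zero up to rounding)
```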

  9. [Figure: two error distributions, “Jar #1” and “Jar #2,” each with mean μ = 0 but different variances; one ranges from roughly -4 to 4, the other from roughly -400 to 400. Caption: two distributions with the same mean and different variances.]

  10. [Figure: the disturbance distributions under heteroscedasticity; error densities f(·) around the regression line of Y on X widen as X moves from X₁ to X₂.]

  11. [Figure: scatter diagram of ascending heteroscedasticity; spending for electronics plotted against household income, with the spread of spending widening as income rises.]
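
Data like that figure are easy to simulate (all numbers invented; the variable names follow the slide): the error standard deviation is made proportional to income, so the scatter fans out at higher incomes.

```python
import numpy as np

rng = np.random.default_rng(5)
income = rng.uniform(20, 200, 300)                # household income
errors = rng.normal(size=300) * (0.05 * income)   # error SD grows with income
spending = 5.0 + 0.02 * income + errors           # spending for electronics

low, high = income < 110, income >= 110
print(errors[low].std(), errors[high].std())      # spread rises with income
```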

  12. Why is heteroscedasticity a problem? • Heteroscedasticity does not give us biased estimates of the coefficients; however, it does make the standard errors of the estimates unreliable. That is, we will understate the standard errors. • Due to the aforementioned problem, t-tests cannot be trusted. We run the risk of rejecting a null hypothesis that should not be rejected. (See the sketch below.)
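
A sketch of both points with statsmodels on simulated heteroscedastic data: classical OLS standard errors are compared with heteroscedasticity-robust (HC1, “White”) standard errors, and a Breusch-Pagan test flags the problem. Neither tool is named on the slide; they are standard follow-ups.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(6)
x = rng.uniform(1, 10, 500)
y = 1 + 2 * x + rng.normal(size=500) * x    # error SD proportional to x
X = sm.add_constant(x)

naive = sm.OLS(y, X).fit()                  # classical (homoscedastic) SEs
robust = sm.OLS(y, X).fit(cov_type="HC1")   # heteroscedasticity-robust SEs
print(naive.bse[1], robust.bse[1])          # naive slope SE is understated

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(naive.resid, X)
print(lm_pvalue)                             # small p-value: heteroscedasticity
```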
