
Multiple linear regression MLR


Presentation Transcript


  1. Multiple linear regression (MLR)

  2. Assumptions • The model is linear in the parameters. • The error terms are statistically independent. • The independent variables are linearly independent. • All populations have equal variances.
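The model behind these assumptions can be sketched with ordinary least squares. A minimal example using NumPy, with hypothetical data generated so that y = 1 + 2·x1 + 0.5·x2 exactly:

```python
import numpy as np

# Hypothetical data: two independent variables, n = 6 observations,
# constructed so that y = 1 + 2*x1 + 0.5*x2 holds exactly.
X = np.array([
    [1.0, 2.0],
    [2.0, 1.0],
    [3.0, 4.0],
    [4.0, 3.0],
    [5.0, 6.0],
    [6.0, 5.0],
])
y = np.array([4.0, 5.5, 9.0, 10.5, 14.0, 15.5])

# Add an intercept column, then estimate b in y = Xb + e by least squares.
A = np.column_stack([np.ones(len(y)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # [intercept, b1, b2], approximately [1.0, 2.0, 0.5]
```

Because the columns of the design matrix here are linearly independent, the least-squares solution is unique, which is exactly what the third assumption guarantees.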

  3. Linear in parameters

  4. Not linear in parameters • A violation of the assumptions.

  5. Linear in parameter but nonlinear in variables • Not a violation of assumptions
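The distinction in slides 3-5 can be shown concretely: a quadratic trend is nonlinear in the variable but still linear in the parameters, so ordinary least squares applies unchanged. A sketch with hypothetical noise-free data:

```python
import numpy as np

# Hypothetical data generated from y = 2 + 3x + x^2: nonlinear in x,
# but a linear combination of the parameters (2, 3, 1).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 2 + 3 * x + x ** 2

# Design matrix [1, x, x^2]: treating x^2 as just another column
# keeps the model linear in the parameters.
A = np.column_stack([np.ones_like(x), x, x ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # recovers approximately [2, 3, 1]
```

A model like y = b0 + x^b1, by contrast, cannot be written as a linear combination of its parameters and would require non-linear regression.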

  6. Violation of the assumption that the model is linear in parameters • This is called a mis-specification error. • This means the model has been written improperly. • There is such a thing as non-linear regression.

  7. The error terms are statistically independent • If the error terms are statistically independent, then the value of the error term at time t will not be correlated with the values of the error terms at any other time period. • The ACF of statistically independent error terms will be the ACF of white noise.

  8. The error terms are statistically independent • The violation of this assumption is called serial correlation. • Serial correlation can be detected using the Durbin-Watson test or the ACF of the residuals. • Look at a plot of the residuals.
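The Durbin-Watson statistic mentioned above is simple to compute from the residuals: it is the sum of squared successive differences divided by the sum of squared residuals. A minimal sketch with hypothetical residual series:

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum((e_t - e_{t-1})^2) / sum(e_t^2).
    Values near 2 suggest no first-order serial correlation; values
    near 0 suggest positive, and near 4 negative, serial correlation."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid) ** 2) / np.sum(resid ** 2)

# Smoothly trending residuals (strong positive serial correlation)
# give a statistic near 0 ...
trending = np.sin(np.linspace(0, np.pi, 50))
print(durbin_watson(trending))

# ... while alternating residuals (negative serial correlation)
# give a statistic near 4.
alternating = np.tile([1.0, -1.0], 25)
print(durbin_watson(alternating))
```

Statistically independent residuals would land near 2; either extreme is the warning sign the slides describe.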

  9. Causes of serial correlation • Omitting a relevant variable from a regression equation. • A mis-specification error.

  10. Consequences of serial correlation • The estimates of the standard errors of the regression coefficients will be wrong. • So the t-tests and p-values will be wrong as well.

  11. Consequences for forecasting • Can be very severe

  12. Fixes for serial correlation • Find the missing relevant variable. • Write the regression equation correctly to avoid mis-specification. • Add a lagged dependent variable.

  13. The independent variables are linearly independent • The independent variables are not linearly independent if you can write one of them as some linear combination of the others.

  14. Perfect multicollinearity

  15. Near multicollinearity

  16. Detection of multicollinearity • Plot the independent variables. • Compute the correlations between the independent variables. • Look for logical inconsistencies in the regression statistics.
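The second detection method, computing the correlations between the independent variables, can be sketched with hypothetical predictors where one is nearly a linear combination of another:

```python
import numpy as np

# Hypothetical predictors: x2 is almost exactly 2*x1 (near
# multicollinearity), while x3 is unrelated to both.
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = 2 * x1 + rng.normal(scale=0.05, size=100)  # almost collinear with x1
x3 = rng.normal(size=100)

# Pairwise correlation matrix of the independent variables.
corr = np.corrcoef([x1, x2, x3])
print(np.round(corr, 3))  # corr[0, 1] will be close to 1
```

A pairwise correlation near ±1, as between x1 and x2 here, is the red flag; note that this check can miss multicollinearity involving three or more variables jointly, which is where the "logical inconsistencies" in the regression output come in.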

  17. Fix for multicollinearity • Find this and you will become very famous amongst econometricians. • Can’t really omit one of the offending variables. • At times it doesn’t really matter for forecasting. (St. Louis Model)

  18. Heteroscedasticity • Violation of the assumption that the populations that the samples come from all have the same variance.

  19. Consequences of heteroscedasticity • Same as with serial correlation as far as the estimates of the standard errors of the coefficients are concerned. • Makes the forecasts increasingly uncertain.

  20. Fix for heteroscedasticity • In this course we will take logs of the dependent variable and perhaps the logs of all variables. • More sophisticated methods exist but are difficult to use and also require a good deal of work.
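The log fix can be sketched on a hypothetical series whose errors are multiplicative, so its spread grows with its level; taking logs makes the errors additive and roughly constant-variance:

```python
import numpy as np

# Hypothetical series: an exponential trend with multiplicative
# (lognormal) errors, so the spread grows with the level of the series.
rng = np.random.default_rng(1)
t = np.arange(1, 201)
level = np.exp(0.02 * t)
y = level * rng.lognormal(sigma=0.2, size=t.size)

# After logging, the model is 0.02*t plus additive normal noise with
# constant variance: the heteroscedasticity is gone.
log_y = np.log(y)

# Compare the spread of each series in its early vs late half.
print(y[:100].std(), y[100:].std())          # raw spread grows over time
print(log_y[:100].std(), log_y[100:].std())  # comparable after logging
```

This is why logging the dependent variable is often the first thing tried: it turns a multiplicative error structure into the additive, equal-variance one the MLR assumptions require.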
