
OUTLIER, HETEROSKEDASTICITY, AND NORMALITY


Presentation Transcript


  1. OUTLIER, HETEROSKEDASTICITY, AND NORMALITY • Robust Regression • HAC Estimate of Standard Error • Quantile Regression

  2. Robust regression analysis • an alternative to a least squares regression model when the nature of the data leaves its fundamental assumptions unfulfilled • resistant to the influence of outliers • deals with residual problems • implemented in Stata & EViews

  3. Alternatives to OLS • A. White’s standard errors: OLS with a HAC estimate of the standard error • B. Weighted least squares: robust regression • C. Quantile regression: median regression, bootstrapping

  4. OLS and Heteroskedasticity • What are the implications of heteroskedasticity for OLS? • Under the Gauss–Markov assumptions (including homoskedasticity), OLS was the Best Linear Unbiased Estimator. • Under heteroskedasticity, is OLS still Unbiased? • Is OLS still Best? • Answer: OLS remains unbiased, but it is no longer efficient, and the classical standard errors are invalid.
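The first answer above can be checked by simulation. The sketch below (a minimal illustration in Python/NumPy, not from the original deck; the model and parameter values are invented for the demo) generates data whose error variance grows with x, refits OLS many times, and shows that the average estimated slope still recovers the true slope:

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps, true_beta = 200, 500, 2.0

slopes = []
for _ in range(reps):
    x = rng.uniform(1, 10, n)
    # heteroskedastic noise: the error standard deviation grows with x
    y = 1.0 + true_beta * x + rng.normal(0, 0.5 * x)
    slopes.append(np.polyfit(x, y, 1)[0])   # OLS slope for this sample

# the average slope across replications is close to true_beta,
# so OLS stays unbiased despite heteroskedasticity
print(np.mean(slopes))
```

What the simulation cannot show directly is efficiency: another estimator (e.g. weighted least squares) would have smaller sampling variance here.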

  5. A. Heteroskedasticity and Autocorrelation Consistent (HAC) Variance Estimation • the robust White variance estimator renders regression resistant to the heteroskedasticity problem • Halbert White (1980) showed that, asymptotically (in large samples), the sample sums of squared error corrections approximate their population counterparts under heteroskedasticity • yielding a heteroskedasticity-consistent estimate of the standard errors

  6. Quantile Regression • Problem • The distribution of Y, the “dependent” variable, conditional on the covariate X, may have thick tails. • The conditional distribution of Y may be asymmetric. • The conditional distribution of Y may not be unimodal. Neither regression nor ANOVA will give us robust results: outliers are problematic, the mean is pulled toward the skewed tail, and multiple modes will not be revealed.

  7. Reasons to use quantiles rather than means • Analysis of distribution rather than average • Robustness • Skewed data • Interested in representative value • Interested in tails of distribution • Unequal variation of samples • E.g. income distribution is highly skewed, so the median relates more to the typical person than the mean.

  8. Quantiles • Cumulative distribution function: F(y) = P(Y ≤ y) • Quantile function: Q(τ) = inf{y : F(y) ≥ τ}, the inverse of the CDF • For a finite sample, the empirical quantile function is a discrete step function

  9. Regression Line

  10. The Perspective of Quantile Regression (QR)

  11. Optimality Criteria • Linear absolute (check) loss: ρτ(u) = u(τ − I(u < 0)), where I is the 0/1 indicator function • Quadratic loss is optimized by the mean • ρτ is optimized by the τ-th quantile
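The optimality claim can be verified numerically: minimizing the check loss ρτ(u) = u(τ − I(u < 0)) over a constant recovers the sample τ-quantile. A small grid-search sketch in Python (the exponential sample and grid are invented for the demo):

```python
import numpy as np

def check_loss(u, tau):
    # rho_tau(u) = u * (tau - I(u < 0)); (u < 0) acts as the 0/1 indicator
    return u * (tau - (u < 0))

rng = np.random.default_rng(2)
y = rng.exponential(size=2000)          # a skewed sample
grid = np.linspace(0, 5, 2001)

minimizers = {}
for tau in (0.25, 0.5, 0.75):
    losses = [np.sum(check_loss(y - c, tau)) for c in grid]
    minimizers[tau] = grid[np.argmin(losses)]
    print(tau, minimizers[tau], np.quantile(y, tau))
```

For τ = 0.5 the check loss reduces to (half) the absolute loss, which is why the median is the absolute-loss optimizer mentioned on the next slide.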

  12. Quantile Regression: Absolute Loss vs. Quadratic Loss

  13. Simple Linear Regression: Food Expenditure vs. Income • Engel’s 1857 survey of 235 Belgian households • a range of quantiles • does the slope change at different quantiles?

  14. Bootstrapping • When distributional normality and homoskedasticity assumptions are violated, many researchers resort to nonparametric bootstrapping methods
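One common nonparametric variant is the pairs (case) bootstrap, sketched below for a regression slope (the simulated data, replication count, and percentile interval are illustrative choices, not from the slides): resample whole (x, y) rows with replacement, refit, and read confidence limits off the empirical distribution of the refitted slopes.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 150
x = rng.uniform(1, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.5 * x)    # heteroskedastic noise

# pairs bootstrap: resample (x_i, y_i) rows with replacement, refit OLS
boot_slopes = []
for _ in range(1000):
    idx = rng.integers(0, n, n)
    boot_slopes.append(np.polyfit(x[idx], y[idx], 1)[0])

# 95% percentile confidence limits for the slope
lo, hi = np.percentile(boot_slopes, [2.5, 97.5])
print(lo, hi)
```

Because rows are resampled as units, the procedure makes no normality or homoskedasticity assumption about the errors, which is why it suits the situations this slide describes.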

  15. Bootstrap Confidence Limits
