
Regression Models


Presentation Transcript


  1. Regression Models • Population deterministic regression model: Yi = β0 + β1Xi. • Yi depends only on the value of Xi; no other factor can affect Yi. • Population probabilistic regression model: Yi = β0 + β1Xi + εi, i = 1, 2, ..., n. • E(Y|Xi) = β0 + β1Xi; that is, Yij = E(Y|Xi) + εij = β0 + β1Xi + εij, i = 1, 2, ..., n; j = 1, 2, ..., N. • β0 and β1 are population parameters. • β0 and β1 are estimated by the sample statistics b0 and b1. • Sample model: Ŷi = b0 + b1Xi.
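The contrast between the deterministic part E(Y|Xi) = β0 + β1Xi and the probabilistic model Yi = β0 + β1Xi + εi can be illustrated with a short simulation; the parameter values (β0 = 2, β1 = 0.5, σ = 1) and the generated data are hypothetical, chosen only for illustration:

```python
import random

random.seed(42)  # reproducible illustration
beta0, beta1, sigma = 2.0, 0.5, 1.0  # hypothetical population parameters

x = [float(i) for i in range(1, 21)]
# Deterministic part: E(Y | X_i) = beta0 + beta1 * X_i
ey = [beta0 + beta1 * xi for xi in x]
# The probabilistic model adds a random error term eps_i to each mean
y = [m + random.gauss(0.0, sigma) for m in ey]
```

The means `ey` all lie exactly on the line, while the observed `y` scatter around it, which is the distinction the two models draw.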

  2. Assumptions Underlying Linear Regression (for Y) • For each value of X, there is a group of Y values, and these Y values are normally distributed. • The means of these normal distributions of Y values all lie on the straight regression line. • The error variances of these normal distributions are equal (homoscedasticity); non-constant error variances are called heteroscedasticity. • The Y values are statistically independent: in selecting a sample, the Y values chosen for a particular X value do not depend on the Y values for any other X value.

  3. Equation of the Simple Regression Line

  4. Ordinary Least Squares (OLS) Analysis

  5. Least Squares Analysis
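The OLS estimates that slides 3-5 refer to are the normal-equation solutions b1 = Sxy/Sxx and b0 = ȳ − b1x̄. A minimal sketch on a hypothetical five-point dataset (not from the original deck):

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]  # hypothetical data
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
# Normal-equation solutions: b1 = S_xy / S_xx, b0 = ybar - b1 * xbar
sxx = sum((xi - xbar) ** 2 for xi in x)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
b1 = sxy / sxx
b0 = ybar - b1 * xbar
```

For this dataset b1 = 0.6 and b0 = 2.2; the same toy numbers are reused in the sketches below so the results can be cross-checked.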

  6. Sum of Squares Error; Standard Error of the Estimate

  7. Sum of Squares Error; Standard Error of the Estimate (proof)
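The quantities named on slides 6-7 are SSE, the sum of squared residuals, and the standard error of the estimate se = sqrt(SSE/(n − 2)). A numeric sketch, with the hypothetical five-point data and its OLS fit (b0 = 2.2, b1 = 0.6):

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
b0, b1 = 2.2, 0.6  # OLS estimates for this hypothetical data
yhat = [b0 + b1 * xi for xi in x]
# SSE: sum of squared residuals about the fitted line
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
# Standard error of the estimate; n - 2 df because b0 and b1 are estimated
se = math.sqrt(sse / (n - 2))
```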

  8. Coefficient of Determination • The coefficient of determination, r2, is the proportion of the total variation in the dependent variable Y that is explained (accounted for) by the variation in the independent variable X. • The coefficient of determination is the square of the coefficient of correlation, and ranges from 0 to 1.
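The definition above translates directly into r2 = 1 − SSE/SST. Continuing with the same hypothetical dataset and fit:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
ybar = sum(y) / n
b0, b1 = 2.2, 0.6  # OLS estimates for this hypothetical data
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
sst = sum((yi - ybar) ** 2 for yi in y)
# r^2: share of the total variation in Y explained by X
r_squared = 1.0 - sse / sst
```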

  9. Analysis of Variance (ANOVA)
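The ANOVA decomposition on slide 9 splits SST into SSR (regression) plus SSE (error), and tests the regression with F = MSR/MSE. A sketch on the same hypothetical data:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
ybar = sum(y) / n
b0, b1 = 2.2, 0.6  # OLS fit for this hypothetical data
sst = sum((yi - ybar) ** 2 for yi in y)                        # total
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))  # error
ssr = sst - sse                                                # regression
msr = ssr / 1          # regression df = 1 (one predictor)
mse = sse / (n - 2)    # error df = n - 2
f_stat = msr / mse
```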

  10. Figure: Measures of variation in regression

  11. Expectation of b1

  12. Variance of b1

  13. Expectation of b0

  14. Variance of b0

  15. Covariance of b0 and b1
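Slides 11-15 derive the sampling moments of the estimators: E(b1) = β1, E(b0) = β0, Var(b1) = σ²/Sxx, Var(b0) = σ²(1/n + x̄²/Sxx), and Cov(b0, b1) = −x̄σ²/Sxx. Estimating σ² by SSE/(n − 2), these can be sketched on the hypothetical dataset:

```python
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b0, b1 = 2.2, 0.6  # OLS estimates for this hypothetical data
# Estimated error variance: SSE / (n - 2)
se2 = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2)
var_b1 = se2 / sxx                          # Var(b1) = sigma^2 / S_xx
var_b0 = se2 * (1.0 / n + xbar ** 2 / sxx)  # Var(b0)
cov_b0_b1 = -xbar * se2 / sxx               # Cov(b0, b1); negative when xbar > 0
```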

  16. Confidence Interval for the Mean Value of Y • The confidence interval for the mean value of Y for a given value of X is given by:

  17. Prediction of Y0

  18. Prediction Interval for an Individual Value of Y0 • The prediction interval for an individual value of Y for a given value of X is given by:
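The interval formulas on slides 16-18 did not survive the transcript, but the standard simple-regression forms can be sketched: both are ŷ0 ± t·se·sqrt(·), where the prediction interval carries an extra "1 +" inside the square root. The dataset, the point x0 = 4, and the fitted coefficients below are hypothetical:

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
xbar = sum(x) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b0, b1 = 2.2, 0.6  # OLS estimates for this hypothetical data
se = math.sqrt(sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))
t_crit = 3.182  # t_{0.025} with n - 2 = 3 df, from a t table

x0 = 4.0
y0_hat = b0 + b1 * x0
# 95% CI half-width for the mean E(Y|x0)
ci_half = t_crit * se * math.sqrt(1.0 / n + (x0 - xbar) ** 2 / sxx)
# 95% PI half-width for an individual Y at x0: the extra "1 +" makes it wider
pi_half = t_crit * se * math.sqrt(1.0 + 1.0 / n + (x0 - xbar) ** 2 / sxx)
```

The prediction interval is always wider than the confidence interval at the same x0, since it must cover a single observation rather than a mean.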

  19. Figure: Confidence intervals for estimation (confidence intervals for E(Y|X), shown at X = 6.5)

  20. The Coefficient of Correlation, r • The coefficient of correlation (r) is a measure of the strength of the relationship between two variables. • It requires interval- or ratio-scaled variables. • It can range from -1.00 to 1.00. • Values of -1.00 or 1.00 indicate perfect correlation. • Values close to 0.0 indicate weak correlation. • Negative values indicate an inverse relationship; positive values indicate a direct relationship.

  21. The (Pearson Product-Moment) Correlation Coefficient: sample and population forms
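The sample Pearson coefficient is r = Sxy / sqrt(Sxx·Syy). Computing it on the hypothetical dataset also confirms the earlier slide's claim that r2 equals the coefficient of determination:

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
syy = sum((yi - ybar) ** 2 for yi in y)
sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
# Sample Pearson correlation: r = S_xy / sqrt(S_xx * S_yy)
r = sxy / math.sqrt(sxx * syy)
```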

  22. Covariance

  23. Coefficient of regression and correlation

  24. F and t statistics
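Slide 24's two test statistics are linked in simple regression: the t statistic for H0: β1 = 0 is t = b1 / (se/√Sxx), and the ANOVA F statistic equals t². A sketch on the hypothetical dataset:

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.0, 4.0, 5.0, 4.0, 5.0]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar
se = math.sqrt(sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y)) / (n - 2))
# t tests H0: beta1 = 0; with one predictor, the ANOVA F equals t^2
t_stat = b1 / (se / math.sqrt(sxx))
f_stat = t_stat ** 2
```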

  25. The Simple Regression Model in Matrix Form
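In matrix notation the model is y = Xβ + ε with OLS solution b = (XᵀX)⁻¹Xᵀy. A sketch with the same hypothetical data, using a least-squares solver rather than an explicit inverse for numerical stability:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # hypothetical data
y = np.array([2.0, 4.0, 5.0, 4.0, 5.0])
# Design matrix: a leading column of ones carries the intercept b0
X = np.column_stack([np.ones_like(x), x])
# Solves min ||Xb - y||^2, i.e. b = (X'X)^{-1} X'y
b, *_ = np.linalg.lstsq(X, y, rcond=None)
```

The result matches the scalar normal-equation solution (b0 = 2.2, b1 = 0.6), as it must.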
