
Simple Linear Regression (SLR) CHE1147 Saed Sayad University of Toronto






Presentation Transcript


  1. Simple Linear Regression (SLR) CHE1147 Saed Sayad University of Toronto

  2. Types of Correlation Positive correlation Negative correlation No correlation

  3. Simple linear regression describes the linear relationship between a predictor (independent) variable X, plotted on the x-axis, and a response (dependent) variable Y, plotted on the y-axis.

  4.–7. [Figures: plots of Y against X illustrating the fitted regression line and the residuals (ε)]

  8. Fitting data to a linear model: Y = b0 + b1X + ε, where b0 is the intercept, b1 the slope, and ε the residuals.

  9. How to fit data to a linear model? The Ordinary Least Squares (OLS) method.

  10. Least Squares Regression. Model line: ŷ = b0 + b1x. Residual: εi = yi − ŷi. Sum of squares of residuals: SSE = Σ(yi − ŷi)². We must find the values of b0 and b1 that minimise SSE.

  11. Regression Coefficients: b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)², b0 = ȳ − b1x̄.
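The least-squares fit can be sketched in a few lines of NumPy. This is a minimal illustration on made-up x/y values (not data from the lecture): it computes the slope as the ratio of the cross-product and x sums of squares, then the intercept from the means.

```python
# Minimal OLS sketch for simple linear regression (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # hypothetical predictor values
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])  # hypothetical responses

x_bar, y_bar = x.mean(), y.mean()
Sxx = np.sum((x - x_bar) ** 2)            # sum of squared deviations of x
Sxy = np.sum((x - x_bar) * (y - y_bar))   # cross-product of deviations

b1 = Sxy / Sxx            # slope
b0 = y_bar - b1 * x_bar   # intercept

sse = np.sum((y - (b0 + b1 * x)) ** 2)    # the quantity OLS minimises
```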

  12. Required Statistics

  13. Descriptive Statistics

  14. Regression Statistics

  15. [Figure: total variance in Y to be explained by the predictors (SST)]

  16. [Figure: partition of the variance in Y into the part explained by X1 (SSR) and the part not explained by X1 (SSE)]

  17. Regression Statistics

  18. Regression Statistics. Coefficient of determination, R² = SSR/SST, used to judge the adequacy of the regression model.

  19. Regression Statistics. Correlation (r) measures the strength of the linear association between two variables.
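The two statistics above are directly related: in simple linear regression, R² equals the square of the correlation coefficient r. A short sketch on hypothetical data makes the decomposition SST = SSR + SSE and the identity R² = r² concrete:

```python
# R^2 and correlation r for a simple linear regression (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)   # total variance to be explained
sse = np.sum((y - y_hat) ** 2)      # unexplained (residual) variance
ssr = sst - sse                     # variance explained by the regression

r2 = ssr / sst                      # coefficient of determination
r = np.corrcoef(x, y)[0, 1]         # correlation coefficient
```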

  20. Regression Statistics. Standard error for the regression model: s = √(SSE / (n − 2)).

  21. ANOVA. Analysis of variance to test the significance of the regression.
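The ANOVA test of significance compares the mean square due to regression against the mean square error. A sketch on hypothetical data (for SLR, the regression has 1 degree of freedom and the error has n − 2); H0: b1 = 0 is rejected when F exceeds the critical value F(α; 1, n − 2):

```python
# ANOVA F-test for significance of a simple linear regression (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

ssr = np.sum((y_hat - y.mean()) ** 2)   # regression sum of squares, df = 1
sse = np.sum((y - y_hat) ** 2)          # error sum of squares, df = n - 2

msr = ssr / 1                           # mean square regression
mse = sse / (n - 2)                     # mean square error
F = msr / mse                           # compare to F(alpha; 1, n-2)
```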

  22. Hypothesis Tests for Regression Coefficients

  23. Hypothesis Tests for Regression Coefficients

  24. Confidence Interval on Regression Coefficients Confidence Interval for b1
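A confidence interval for the slope can be sketched as b1 ± t(α/2, n−2) · se(b1), with se(b1) = √(MSE/Sxx). The data are made up, and the critical value t(0.025, 3) ≈ 3.182 is hardcoded here (for n = 5) to keep the example dependency-free:

```python
# 95% confidence interval for the slope b1 (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

Sxx = np.sum((x - x.mean()) ** 2)
b1 = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b0 = y.mean() - b1 * x.mean()

mse = np.sum((y - (b0 + b1 * x)) ** 2) / (n - 2)
se_b1 = np.sqrt(mse / Sxx)             # standard error of the slope

t_crit = 3.182                         # t_{0.025, n-2} for n - 2 = 3 df
ci = (b1 - t_crit * se_b1, b1 + t_crit * se_b1)
```

Because the interval excludes zero here, the slope would be judged significantly different from zero at the 5% level.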

  25. Hypothesis Tests on Regression Coefficients

  26. Confidence Interval on Regression Coefficients Confidence Interval for the intercept

  27. Hypothesis Test on the Correlation Coefficient. Test statistic: t = r√(n − 2) / √(1 − r²). We would reject the null hypothesis H0: ρ = 0 if |t| > t(α/2, n − 2).
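The t-test on the correlation coefficient, t = r√(n − 2)/√(1 − r²), can be sketched as follows (same made-up data as earlier examples):

```python
# t-test on the correlation coefficient (hypothetical data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])
n = len(x)

r = np.corrcoef(x, y)[0, 1]
t = r * np.sqrt(n - 2) / np.sqrt(1 - r ** 2)
# Reject H0: rho = 0 when |t| exceeds t_{alpha/2, n-2}
```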

  28. Diagnostic Tests For Regressions. Expected distribution of residuals for a linear model with a normal distribution of residuals (errors).

  29. Diagnostic Tests For Regressions Residuals for a non-linear fit

  30. Diagnostic Tests For Regressions Residuals for a quadratic function or polynomial

  31. Diagnostic Tests For Regressions Residuals are not homogeneous (increasing in variance)
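The diagnostic idea behind slides 28–31 can be demonstrated numerically: fitting a straight line to deliberately curved (quadratic, noise-free, made-up) data leaves a systematic pattern in the residuals, which a residual plot would reveal immediately:

```python
# Residual diagnostics sketch: a linear fit to quadratic data leaves
# residuals that track the curvature (hypothetical, noise-free data).
import numpy as np

x = np.linspace(-3, 3, 25)
y = x ** 2                              # deliberately non-linear

b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
residuals = y - (b0 + b1 * x)

# Residuals correlate almost perfectly with x^2 -> the linear model is wrong
curvature = np.corrcoef(x ** 2, residuals)[0, 1]
```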

  32. Regression – important points. 1. Ensure that the range of values sampled for the predictor variable is large enough to capture the full range of responses by the response variable.

  33. [Figures: two plots of Y against X contrasting a narrow and a sufficiently wide sampled range of X]

  34. Regression – important points 2. Ensure that the distribution of predictor values is approximately uniform within the sampled range.

  35. [Figures: two plots of Y against X contrasting a clumped and an approximately uniform distribution of predictor values]

  36. Assumptions of Regression 1. The linear model correctly describes the functional relationship between X and Y.

  37. Assumptions of Regression. 1. The linear model correctly describes the functional relationship between X and Y. [Figure: Y plotted against X]

  38. Assumptions of Regression. 2. The X variable is measured without error. [Figure: Y plotted against X]

  39. Assumptions of Regression 3. For any given value of X, the sampled Y values are independent 4. Residuals (errors) are normally distributed. 5. Variances are constant along the regression line.

  40. Multiple Linear Regression (MLR)

  41. The linear model with a single predictor variable X can easily be extended to two or more predictor variables.

  42. [Figure: Venn diagram of Y, X1 and X2 showing the common variance explained by X1 and X2, the unique variance explained by X1, the unique variance explained by X2, and the variance NOT explained by X1 and X2]

  43. A “good” model. [Figure: Venn diagram in which X1 and X2 together explain most of the variance in Y]

  44. Partial Regression Coefficients. Model: Y = b0 + b1X1 + b2X2 + … + ε (b0 = intercept, ε = residuals). A partial regression coefficient (slope) is the regression coefficient of a predictor X after controlling for (holding constant) the influence of the other predictors on both X and Y.

  45. The matrix algebra of Ordinary Least Squares. Intercept and slopes: b = (XᵀX)⁻¹Xᵀy. Predicted values: ŷ = Xb. Residuals: e = y − ŷ.
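The matrix form of OLS can be sketched with two predictors on simulated data (all values here are made up, generated from known coefficients so the recovered estimates can be checked). Note the column of ones in the design matrix for the intercept, and the use of a linear solve rather than an explicit matrix inverse, which is the numerically preferred way to evaluate (XᵀX)⁻¹Xᵀy:

```python
# Matrix-algebra OLS sketch: b = (X'X)^{-1} X'y, on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 3.0 * x2 + rng.normal(scale=0.1, size=n)

X = np.column_stack([np.ones(n), x1, x2])  # design matrix with intercept column
b = np.linalg.solve(X.T @ X, X.T @ y)      # intercept and partial slopes
y_hat = X @ b                              # predicted values
e = y - y_hat                              # residuals (orthogonal to X)
```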

  46. Regression Statistics How good is our model?

  47. Regression Statistics. Coefficient of determination, R² = SSR/SST, used to judge the adequacy of the regression model.

  48. Regression Statistics. Adjusted R² = 1 − (1 − R²)(n − 1)/(n − k − 1), where n = sample size and k = number of independent variables. Unlike R², adjusted R² is not biased upward by adding predictors.
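The adjustment penalises extra predictors, so adjusted R² is always at most R². A one-line sketch with hypothetical values for n, k and R²:

```python
# Adjusted R^2 sketch: penalises the number of predictors (hypothetical values).
n, k = 50, 2      # sample size and number of independent variables
r2 = 0.95         # ordinary R^2 from some fitted model

r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)
```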

  49. Regression Statistics. Standard error for the regression model: s = √(SSE / (n − k − 1)).

  50. ANOVA. Analysis of variance to test the significance of the regression. H0: all slope coefficients are zero; H1: at least one slope coefficient is non-zero.
