
Probability & Statistical Inference Lecture 9



Presentation Transcript


  1. Probability & Statistical Inference Lecture 9 MSc in Computing (Data Analytics)

  2. Lecture Outline • Simple Linear Regression • Multiple Regression

  3. ANOVA vs Simple Linear Regression

  4. ANOVA vs Simple Linear Regression

  5. Scatter Plot A scatter plot (or scattergraph) is a type of chart that uses Cartesian coordinates to display the values of two continuous variables for a set of data.
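As a quick illustration, a minimal matplotlib sketch (the data and variable names here are made up for illustration):

```python
# A minimal scatter plot of two continuous variables (made-up data).
import matplotlib.pyplot as plt

temperature = [100, 110, 120, 130, 140, 150]      # x: process temperature
yield_pct = [61.2, 64.0, 65.3, 68.1, 70.4, 72.8]  # y: product yield (%)

plt.scatter(temperature, yield_pct)
plt.xlabel("Temperature")
plt.ylabel("Yield (%)")
plt.title("Yield versus temperature")
plt.show()
```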

  6. Describe Linear Relationship • Correlation – You can quantify the relationship between two variables with correlation statistics. Two variables are correlated if there is a linear relationship between them. • You can classify correlated variables according to the type of correlation: • Positive: one variable tends to increase in value as the other increases in value. • Negative: one variable tends to decrease in value as the other increases in value. • Zero: no linear relationship between the two variables (uncorrelated).

  7. Pearson Correlation Coefficient For pairs (xi, yi), the sample (Pearson) correlation coefficient is r = Σ(xi − x̄)(yi − ȳ) / √[Σ(xi − x̄)² · Σ(yi − ȳ)²], and it always lies in the range −1 ≤ r ≤ 1.
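In code, r can be computed directly; a minimal numpy sketch (made-up data):

```python
# Pearson correlation coefficient via numpy (made-up data).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])

r = np.corrcoef(x, y)[0, 1]  # off-diagonal entry of the 2x2 correlation matrix
print(f"r = {r:.3f}")        # near +1: strong positive linear relationship
```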

  8. Caution using correlation Anscombe's quartet: four sets of data with the same correlation of 0.816 but very different patterns, so always plot the data before relying on r.
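This can be reproduced with seaborn's bundled copy of Anscombe's quartet (a sketch assuming load_dataset can fetch the seaborn-data files):

```python
# Anscombe's quartet: four datasets with (nearly) identical correlations
# but very different shapes -- always plot the data, not just r.
import seaborn as sns

df = sns.load_dataset("anscombe")          # columns: dataset, x, y
for name, group in df.groupby("dataset"):
    r = group["x"].corr(group["y"])        # Pearson correlation per set
    print(f"dataset {name}: r = {r:.3f}")  # each is roughly 0.816
```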

  9. Demo

  10. Regression Analysis Introduction • Many problems in engineering and science involve exploring the relationships between two or more variables. • Regression analysis is a statistical technique that is very useful for these types of problems. • For example, in a chemical process, suppose that the yield of the product is related to the process-operating temperature. • Regression analysis can be used to build a model to predict yield at a given temperature level.

  11. Example

  12. Scatter Plot

  13. Regression Model • Based on the scatter diagram, it is probably reasonable to assume that the random variable Y is related to x by a straight-line relationship. We use the equation of a line to model the relationship. The simple linear regression model is given by Y = β0 + β1x + ε, where the intercept β0 and the slope β1 of the line are called regression coefficients and ε is the random error term.

  14. Regression Model Figure: the slope β1 equals the change in the mean of Y for a one-unit change in x.

  15. Regression Model The true regression model is a line of mean values: μY|x = β0 + β1x, where β1 can be interpreted as the change in the mean of Y for a unit change in x. Also, the variability of Y at a particular value of x is determined by the error variance, σ². This implies there is a distribution of Y-values at each x and that the variance of this distribution is the same at each x.

  16. Regression Model

  17. Simple Linear Regression • The case of simple linear regression considers a single regressor or predictor x and a dependent or response variable Y. • At each level of x, the response Y is a random variable with expected value E(Y|x) = β0 + β1x. • We assume that each observation, Y, can be described by the model Y = β0 + β1x + ε.

  18. Simple Linear Regression Suppose that we have n pairs of observations (x1, y1), (x2, y2), …, (xn, yn). Figure: deviations of the data from the estimated regression model.

  19. Simple Linear Regression The method of least squares is used to estimate the parameters β0 and β1 by minimizing the sum of the squares of the vertical deviations shown in the figure above (the deviations of the data from the estimated regression model).

  20. Least Squares Estimator The least squares estimates of the slope and intercept are β̂1 = Sxy / Sxx and β̂0 = ȳ − β̂1x̄ (see the notation on slide 22).

  21. Model Estimates The fitted (estimated) regression line is ŷ = β̂0 + β̂1x.

  22. Notation Sxx = Σ(xi − x̄)² and Sxy = Σ(xi − x̄)(yi − ȳ), the corrected sums of squares and cross-products.
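A minimal sketch of these formulas in numpy (the data are made up; β̂1 = Sxy/Sxx and β̂0 = ȳ − β̂1x̄ as above):

```python
# Least squares estimates for simple linear regression,
# computed directly from the Sxx / Sxy formulas (made-up data).
import numpy as np

x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
y = np.array([4.8, 6.1, 7.2, 7.9, 9.4, 10.1])

Sxx = np.sum((x - x.mean()) ** 2)
Sxy = np.sum((x - x.mean()) * (y - y.mean()))

b1 = Sxy / Sxx                 # slope estimate
b0 = y.mean() - b1 * x.mean()  # intercept estimate
print(f"y-hat = {b0:.3f} + {b1:.3f} x")
```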

  23. Example

  24. Example

  25. Example Scatter plot of oxygen purity y versus hydrocarbon level x and regression model ŷ = 74.20 + 14.97x.
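Reading a prediction off this fitted line: at hydrocarbon level x = 1.00, the predicted purity is ŷ = 74.20 + 14.97(1.00) = 89.17.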

  26. Demo

  27. Model Assumptions • Fitting a regression model requires several assumptions: • Errors are uncorrelated random variables with mean zero; • Errors have constant variance; and • Errors are normally distributed. • The analyst should always consider the validity of these assumptions to be doubtful and conduct analyses to examine the adequacy of the model.

  28. Testing Assumptions – Residual Analysis • The residuals from a regression model are ei = yi − ŷi, where yi is an actual observation and ŷi is the corresponding fitted value from the regression model. • Analysis of the residuals is frequently helpful in checking the assumption that the errors are approximately normally distributed with constant variance, and in determining whether additional terms in the model would be useful.
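A minimal sketch of both checks (made-up data; np.polyfit supplies the least squares fit):

```python
# Residual analysis: normal probability plot and residuals-vs-fitted plot.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
y = np.array([4.8, 6.1, 7.2, 7.9, 9.4, 10.1])
b1, b0 = np.polyfit(x, y, 1)  # least squares slope and intercept

y_hat = b0 + b1 * x           # fitted values
e = y - y_hat                 # residuals e_i = y_i - y-hat_i

stats.probplot(e, plot=plt)   # roughly straight line => errors ~ normal
plt.show()

plt.scatter(y_hat, e)         # no visible pattern => constant variance
plt.axhline(0.0, color="gray")
plt.xlabel("Fitted value")
plt.ylabel("Residual")
plt.show()
```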

  29. Residual Analysis Patterns for residual plots. (a) satisfactory, (b) funnel, (c) double bow, (d) nonlinear. [Adapted from Montgomery, Peck, and Vining (2001).]

  30. Example - Residual Analysis

  31. Example - Residual Analysis Normal probability plot of residuals

  32. Example - Residual Analysis Plot of residuals versus predicted oxygen purity, ŷ

  33. Adequacy of the Regression Model • The quantity R² = SSR/SST = 1 − SSE/SST is called the coefficient of determination and is often used to judge the adequacy of a regression model. • 0 ≤ R² ≤ 1. • We often refer (loosely) to R² as the amount of variability in the data explained or accounted for by the regression model.

  34. Adequacy of the Regression Model • For the oxygen purity regression model: R² = SSR/SST = 152.13/173.38 = 0.877. • Thus, the model accounts for 87.7% of the variability in the data.
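A minimal sketch of the computation (same made-up data as the earlier sketches):

```python
# Coefficient of determination R^2 = 1 - SSE/SST (= SSR/SST).
import numpy as np

x = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
y = np.array([4.8, 6.1, 7.2, 7.9, 9.4, 10.1])
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

sse = np.sum((y - y_hat) ** 2)     # error (residual) sum of squares
sst = np.sum((y - y.mean()) ** 2)  # total sum of squares
print(f"R^2 = {1.0 - sse / sst:.3f}")
```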

  35. Multiple Linear Regression

  36. Introduction • Many applications of regression analysis involve situations in which there is more than one regressor variable. • A regression model that contains more than one regressor variable is called a multiple regression model.

  37. Introduction • For example, suppose that the effective life of a cutting tool depends on the cutting speed and the tool angle. A possible multiple regression model could be Y = β0 + β1x1 + β2x2 + ε, where: Y – tool life, x1 – cutting speed, x2 – tool angle.
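A minimal least squares fit of such a two-regressor model (the tool-life numbers below are made up for illustration):

```python
# Fitting Y = b0 + b1*x1 + b2*x2 by least squares with a design matrix.
import numpy as np

x1 = np.array([500.0, 600.0, 700.0, 800.0, 900.0, 1000.0])  # cutting speed
x2 = np.array([15.0, 20.0, 15.0, 20.0, 15.0, 20.0])         # tool angle
y = np.array([42.0, 39.5, 36.1, 34.8, 30.2, 29.0])          # tool life

X = np.column_stack([np.ones_like(x1), x1, x2])  # intercept column + regressors
beta, *_ = np.linalg.lstsq(X, y, rcond=None)     # [b0, b1, b2]
print("b0, b1, b2 =", beta.round(4))
```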

  38. Introduction Figure: the regression plane and the contour plot for the model E(Y) = 50 + 10x1 + 7x2.

  39. Introduction

  40. Demo

  41. Regression & Variable Selection • How do we select the best variables for use in a regression model? • Perform a search to see which variables are the most effective. • Three search schemes: • Forward sequential selection • Backward sequential selection • Stepwise sequential selection. A code sketch of forward selection follows the slides below.

  42. Sequential Selection – Forward • Start with no inputs in the model. • At each step, compute the p-value of each candidate input when it is added to the current model. • Add the input with the smallest p-value, provided it falls below the entry cutoff. • Stop when no remaining input meets the entry cutoff.

  46. Sequential Selection – Backward • Start with all inputs in the model. • At each step, compute the p-value of each input in the current model. • Remove the input with the largest p-value if it exceeds the stay cutoff. • Stop when every remaining input's p-value is below the stay cutoff.
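Putting the forward scheme into code: a minimal sketch using statsmodels OLS p-values (the function name and the 0.05 entry cutoff are illustrative assumptions, not from the lecture):

```python
# Forward sequential selection: repeatedly add the candidate input whose
# p-value in the enlarged model is smallest, while it beats the entry cutoff.
import pandas as pd
import statsmodels.api as sm

def forward_select(X: pd.DataFrame, y: pd.Series, entry_cutoff: float = 0.05):
    selected = []                # inputs currently in the model
    remaining = list(X.columns)  # candidate inputs
    while remaining:
        pvals = {}
        for col in remaining:
            design = sm.add_constant(X[selected + [col]])
            pvals[col] = sm.OLS(y, design).fit().pvalues[col]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= entry_cutoff:  # no candidate meets the entry cutoff
            break
        selected.append(best)
        remaining.remove(best)
    return selected
```

Backward selection runs the same loop in reverse: start with all inputs and repeatedly drop the largest p-value until every remaining input beats the stay cutoff.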
