
Chapter 3: Introductory Linear Regression



Presentation Transcript


  1. Chapter 3: Introductory Linear Regression Chapter Outline 3.1 Simple Linear Regression: Scatter Plot/Diagram; Simple Linear Regression Model 3.2 Curve Fitting 3.3 Inferences About Estimated Parameters 3.4 Adequacy of the Model: Coefficient of Determination 3.5 Pearson Product Moment Correlation Coefficient 3.6 Test for Linearity of Regression 3.7 ANOVA Approach for Testing Linearity of Regression

  2. INTRODUCTION TO LINEAR REGRESSION • Regression is a statistical procedure for establishing the relationship between two or more variables. • This is done by fitting a linear equation to the observed data. • The regression line is used by the researcher to see the trend and make predictions of values for the data. • There are two types of relationship: • Simple (two variables) • Multiple (more than two variables)

  3. INTRODUCTION TO LINEAR REGRESSION Many problems in science and engineering involve exploring the relationship between two or more variables. Two statistical techniques are used: regression analysis and computation of the correlation coefficient (r). Linear regression is the study of the linear relationship between two or more variables. This is done by fitting a linear equation to the observed data. The linear equation is then used to predict values for the data.

  4. In simple linear regression only two variables are involved: X is the independent variable and Y is the dependent variable. The correlation coefficient (r) tells us how strongly the two variables are related.

  5. Example 3.1: 1) A nutritionist studying weight loss programs might want to find out whether reducing carbohydrate intake can help a person lose weight. a) X is the carbohydrate intake (independent variable). b) Y is the weight (dependent variable). 2) An entrepreneur might want to know whether increasing the cost of packaging his new product will have an effect on the sales volume. a) X is the packaging cost (independent variable). b) Y is the sales volume (dependent variable).

  6. 3.1 SIMPLE LINEAR REGRESSION MODEL A linear regression model is a model that expresses the linear relationship between two variables. The simple linear regression model is written as $Y = \beta_0 + \beta_1 X + \varepsilon$, where $Y$ is the dependent variable, $X$ is the independent variable, $\beta_0$ is the y-intercept, $\beta_1$ is the slope, and $\varepsilon$ is the random error. The random error is the difference between an observed data point and the deterministic value $\beta_0 + \beta_1 X$.
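As a minimal sketch (not taken from the chapter), the model can be simulated in Python; the parameter values $\beta_0 = 2$, $\beta_1 = 0.5$ and the error spread are illustrative assumptions.

```python
# A minimal sketch of the model Y = b0 + b1*X + e; the parameter values and
# error spread are illustrative assumptions, not values from the chapter.
import numpy as np

rng = np.random.default_rng(seed=1)

b0, b1 = 2.0, 0.5                        # assumed intercept and slope
x = np.linspace(0, 10, 20)               # independent variable X
eps = rng.normal(0.0, 1.0, size=x.size)  # random error: mean 0, constant variance
y = b0 + b1 * x + eps                    # observed Y scatters around the line
```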

  7. 3.2 CURVE FITTING (SCATTER PLOT) Scatter plots show the relationship between two variables by displaying data points on a two-dimensional graph. The variable that might be considered the explanatory variable is plotted on the x-axis, and the response variable is plotted on the y-axis. Scatter plots are especially useful when there are a large number of data points.

  8. They provide the following information about the relationship between two variables: (1) Strength (2) Shape - linear, curved, etc. (3) Direction - positive or negative (4) Presence of outliers
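For illustration, a scatter plot like the ones described above can be drawn with matplotlib; the data points here are placeholders, not the chapter's examples.

```python
# Illustrative scatter plot; the x and y values are placeholder data.
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2.1, 2.9, 3.8, 5.2, 5.9, 7.1, 8.2, 8.8]

plt.scatter(x, y)                        # explanatory variable on the x-axis
plt.xlabel("X (independent variable)")
plt.ylabel("Y (dependent variable)")
plt.title("Scatter plot of Y versus X")
plt.show()
```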

  9. Examples:

  10. PLOTTING LINEAR REGRESSION MODEL • A linear regression line can be developed from a freehand plot of the data. Example 3.2: The given table contains values for two variables, X and Y. Plot the given data and draw a freehand estimated regression line.

  11. 3.3 INFERENCES ABOUT ESTIMATED PARAMETERS LEAST SQUARES METHOD The least squares method is the method most commonly used for estimating the regression coefficients. The straight line fitted to the data set is the line $\hat{y} = \hat{\beta}_0 + \hat{\beta}_1 x$, where $\hat{y}$ is the estimated value of Y for a given value of X.

  12. i) y-Intercept for the Estimated Regression Equation: $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$, where $\bar{x}$ and $\bar{y}$ are the sample means of X and Y.

  13. ii) Slope for the Estimated Regression Equation: $\hat{\beta}_1 = \dfrac{S_{xy}}{S_{xx}} = \dfrac{\sum xy - \frac{(\sum x)(\sum y)}{n}}{\sum x^2 - \frac{(\sum x)^2}{n}}$
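A short Python sketch of these two formulas is shown below; the x and y arrays are placeholder data, not any of the chapter's data sets. The fitted line can then be evaluated at a new x to make a prediction, which is the same calculation Example 3.3 and Exercise 3.1 ask for at x = 60 and x = 50 respectively.

```python
# Least squares slope and intercept via the computational formulas above.
# The x and y arrays are placeholder data for illustration only.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = x.size

Sxy = np.sum(x * y) - x.sum() * y.sum() / n
Sxx = np.sum(x ** 2) - x.sum() ** 2 / n

b1 = Sxy / Sxx                  # estimated slope
b0 = y.mean() - b1 * x.mean()   # estimated y-intercept

y_hat = b0 + b1 * 3.5           # predicted Y at X = 3.5 (placeholder value)
print(b0, b1, y_hat)
```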

  14. Example 3.3: Students' Scores in History The data below represent scores obtained by ten primary school students before and after they were taken on a tour to the museum (which is supposed to increase their interest in history). a) Develop a linear regression model with "before" as the independent variable and "after" as the dependent variable. b) Predict the score a student would obtain "after" if he scored 60 marks "before".

  15. Exercise 3.1: a) Fit a linear regression model with income as the independent variable and food expenditure as the dependent variable. b) Predict the food expenditure if the income is 50.

  16. Exercise 3.2:

  17. 3.4 ADEQUACY OF THE MODEL: COEFFICIENT OF DETERMINATION ($r^2$) • The coefficient of determination is a measure of the variation in the dependent variable (Y) that is explained by the regression line and the independent variable (X). • The symbol for the coefficient of determination is $r^2$ or $R^2$. • If $r = 0.90$, then $r^2 = 0.81$. It means that 81% of the variation in the dependent variable (Y) is accounted for by the variation in the independent variable (X).

  18. The rest of the variation, 0.19 or 19%, is unexplained and is called the coefficient of non-determination. • The formula for the coefficient of non-determination is $1 - r^2$.

  19. Relationship Among SST, SSR, SSE: SST = SSR + SSE, where SST = total sum of squares, SSR = sum of squares due to regression, and SSE = sum of squares due to error. • The coefficient of determination is $r^2 = \dfrac{SSR}{SST}$.
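The decomposition and $r^2$ can be checked numerically; the arrays below are placeholder data, not the chapter's examples.

```python
# Sum-of-squares decomposition SST = SSR + SSE and r^2 = SSR/SST,
# computed on placeholder data.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = x.size

b1 = (np.sum(x * y) - x.sum() * y.sum() / n) / (np.sum(x ** 2) - x.sum() ** 2 / n)
b0 = y.mean() - b1 * x.mean()
fitted = b0 + b1 * x

SST = np.sum((y - y.mean()) ** 2)       # total sum of squares
SSR = np.sum((fitted - y.mean()) ** 2)  # sum of squares due to regression
SSE = np.sum((y - fitted) ** 2)         # sum of squares due to error

r_squared = SSR / SST                   # coefficient of determination
```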

  20. 3.5 PEARSON PRODUCT MOMENT CORRELATION COEFFICIENT (r) • Correlation measures the strength of the linear relationship between two variables. • Also known as Pearson's product moment coefficient of correlation. • The symbol for the sample coefficient of correlation is r. • Formula: $r = \dfrac{S_{xy}}{\sqrt{S_{xx} S_{yy}}}$ or, equivalently, $r = \dfrac{n\sum xy - (\sum x)(\sum y)}{\sqrt{\left[n\sum x^2 - (\sum x)^2\right]\left[n\sum y^2 - (\sum y)^2\right]}}$.
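The computational formula can be applied directly; the data below are again placeholders, and np.corrcoef gives the same value as a cross-check.

```python
# Pearson correlation coefficient via the computational formula,
# on placeholder data; np.corrcoef(x, y)[0, 1] returns the same value.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = x.size

num = n * np.sum(x * y) - x.sum() * y.sum()
den = np.sqrt((n * np.sum(x ** 2) - x.sum() ** 2) *
              (n * np.sum(y ** 2) - y.sum() ** 2))
r = num / den
```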

  21. Properties of r: • Values of r close to 1 imply a strong positive linear relationship between x and y. • Values of r close to -1 imply a strong negative linear relationship between x and y. • Values of r close to 0 imply little or no linear relationship between x and y.

  22. Assumptions About the Error Term ε 1. The error ε is a random variable with a mean of zero. 2. The variance of ε, denoted by σ², is the same for all values of the independent variable. 3. The values of ε are independent. 4. The error ε is a normally distributed random variable.

  23. Example 3.4: Refer to Example 3.3, Students' Scores in History. Calculate the value of r and interpret its meaning. Solution: Thus, there is a strong positive linear relationship between the score obtained before (x) and the score obtained after (y).

  24. Exercise 3.3: Refer to the previous Exercise 3.1 and Exercise 3.2; calculate the correlation coefficient and interpret the results.

  25. 3.6 TEST FOR LINEARITY OF REGRESSION • To test the existence of a linear relationship between two variables x and y, we proceed with testing a hypothesis. • Two tests are commonly used: (i) the t-test and (ii) the F-test.

  26. (i) t-Test 1. Determine the hypotheses: $H_0: \beta_1 = 0$ (no linear relationship) versus $H_1: \beta_1 \neq 0$ (a linear relationship exists). 2. Specify the level of significance α and find the critical value $t_{\alpha/2,\,n-2}$. 3. Compute the test statistic $t = \dfrac{\hat{\beta}_1}{s.e.(\hat{\beta}_1)}$, which follows a t-distribution with $n-2$ degrees of freedom.

  27. 4. Determine the rejection rule: reject $H_0$ if the p-value < α or if $|t| > t_{\alpha/2,\,n-2}$. 5. Conclusion: if $H_0$ is rejected, there is a significant linear relationship between the variables X and Y.
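A sketch of the full t-test procedure on placeholder data follows, assuming a two-sided test at α = 0.05.

```python
# t-test for H0: beta1 = 0 against H1: beta1 != 0, on placeholder data,
# assuming a two-sided test at alpha = 0.05.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = x.size
alpha = 0.05

Sxx = np.sum(x ** 2) - x.sum() ** 2 / n
b1 = (np.sum(x * y) - x.sum() * y.sum() / n) / Sxx
b0 = y.mean() - b1 * x.mean()

SSE = np.sum((y - (b0 + b1 * x)) ** 2)
s2 = SSE / (n - 2)                          # estimated error variance
se_b1 = np.sqrt(s2 / Sxx)                   # standard error of the slope

t_stat = b1 / se_b1
t_crit = stats.t.ppf(1 - alpha / 2, n - 2)
reject_H0 = abs(t_stat) > t_crit            # True: conclude a linear relationship
```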

  28. Example 3.5: Refer to Example 3.3, Students' Scores in History. Test whether the scores before and after the trip are related. Use α = 0.05. Solution: 1) $H_0: \beta_1 = 0$ (no linear relationship), $H_1: \beta_1 \neq 0$ (a linear relationship exists). 2) α = 0.05.

  29. 3) Compute the test statistic from the data. 4) Rejection rule: reject $H_0$ if $|t| > t_{0.025,\,8}$. 5) Conclusion: Thus, we reject $H_0$. The score before (x) has a linear relationship with the score after (y) the trip.

  30. Exercise 3.4:

  31. Exercise 3.5:

  32. (ii) F-Test 1. Determine the hypotheses: $H_0: \beta_1 = 0$ (no linear relationship) versus $H_1: \beta_1 \neq 0$ (a linear relationship exists). 2. Specify the level of significance. 3. Compute the test statistic F = MSR/MSE. 4. Determine the rejection rule: reject $H_0$ if the p-value < α or if $F > F_{\alpha}(1,\,n-2)$.

  33. 5. Conclusion: if $H_0$ is rejected, there is a significant linear relationship between the variables X and Y.
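The F-test can be carried out the same way; the sketch below assumes α = 0.05 and uses placeholder data.

```python
# F-test with F = MSR/MSE and (1, n - 2) degrees of freedom,
# on placeholder data, assuming alpha = 0.05.
import numpy as np
from scipy import stats

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 3.8, 5.2, 5.9])
n = x.size
alpha = 0.05

b1 = (np.sum(x * y) - x.sum() * y.sum() / n) / (np.sum(x ** 2) - x.sum() ** 2 / n)
b0 = y.mean() - b1 * x.mean()
fitted = b0 + b1 * x

MSR = np.sum((fitted - y.mean()) ** 2) / 1   # regression mean square (1 df)
MSE = np.sum((y - fitted) ** 2) / (n - 2)    # error mean square (n - 2 df)

F = MSR / MSE
F_crit = stats.f.ppf(1 - alpha, 1, n - 2)
reject_H0 = F > F_crit                       # equivalently, p-value < alpha
```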

  34. 3.7 ANOVA APPROACH FOR TESTING LINEARITY OF REGRESSION • The analysis of variance (ANOVA) method is an approach to test the significance of the regression. • We can arrange the test procedure using this approach in an ANOVA table, as shown below:

Source of Variation   Sum of Squares   Degrees of Freedom   Mean Square         F
Regression            SSR              1                    MSR = SSR/1         MSR/MSE
Error                 SSE              n - 2                MSE = SSE/(n - 2)
Total                 SST              n - 1
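If the statsmodels library is available, the same ANOVA table can be produced directly; the data frame contents below are placeholders.

```python
# Library-based check of the ANOVA table, assuming pandas and statsmodels
# are installed; the data frame contents are placeholder values.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({"x": [1.0, 2.0, 3.0, 4.0, 5.0],
                   "y": [2.1, 2.9, 3.8, 5.2, 5.9]})

model = ols("y ~ x", data=df).fit()
print(sm.stats.anova_lm(model))          # regression (x) and residual rows
print(model.fvalue, model.f_pvalue)      # F statistic and its p-value
```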

  35. Example 3.6: • The manufacturer of Cardio Glide exercise equipment wants to study the relationship between the number of months since the glide was purchased and the length of time (hours) the equipment was used last week. At α = 0.01, test whether there is a linear relationship between the variables.

  36. Solution: 1) Hypotheses: $H_0: \beta_1 = 0$, $H_1: \beta_1 \neq 0$. 2) From the F-distribution table: F table value = 11.2586. 3) Test statistic: F = MSR/MSE = 17.303, or using the p-value approach: p-value = 0.003. 4) Rejection region: since the F statistic > F table value (17.303 > 11.2586), we reject $H_0$; equivalently, since the p-value (0.003) < 0.01, we reject $H_0$. 5) Thus, there is a linear relationship between the variables (months owned, X, and hours used, Y).
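The tabulated critical value quoted above can be reproduced with scipy; note that the degrees of freedom (1, 8) are an inference from the value 11.2586 itself, which corresponds to a sample of 10 observations.

```python
# Reproducing the tabulated critical value used above; the degrees of
# freedom (1, 8) are inferred from 11.2586, i.e. 10 observations assumed.
from scipy import stats

F_crit = stats.f.ppf(1 - 0.01, 1, 8)   # approximately 11.26
print(F_crit)
```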

  37. EXERCISE 3.6: An agricultural scientist planted alfalfa on several plots of land, identical except for the soil pH. Following are the dry matter yields (in pounds per acre) for each plot.

  38. a) Construct a scatter plot of yield (y) versus pH (x). Verify that a linear model is appropriate. b) Compute the estimated regression line for predicting yield from pH. c) If the pH is increased by 0.1, by how much would you predict the yield to increase or decrease? d) For what pH would you predict a yield of 1500 pounds per acre? e) Calculate the correlation coefficient and interpret the results. Answer:

  39. EXERCISE 3.7: A regression analysis relating the current market value in dollars to the size in square feet of homes in Greeny County, Tennessee, follows. A portion of the regression software output is shown below:

Predictor   Coef         SE Coef      T      P
Constant    12.726       8.115        1.57   0.134
Size        0.00011386   0.00002896   3.93   0.001

Analysis of Variance
Source        DF    SS       MS      F       P
Regression     1    10354    10354   15.46   0.001
Error         18    12054      670
Total         19    22408

a) Determine how many homes were in the sample.
b) Determine the regression equation.
c) Can you conclude that there is a linear relationship between the variables at the chosen level of significance?
