
Objectives of Multiple Regression




  1. Objectives of Multiple Regression • Establish the linear equation that best predicts values of a dependent variable Y using more than one explanatory variable from a large set of potential predictors {x1, x2, ... xk}. • Find that subset of all possible predictor variables that explains a significant and appreciable proportion of the variance of Y, trading off adequacy of prediction against the cost of measuring more predictor variables.

  2. Expanding Simple Linear Regression Adding one or more polynomial terms to the model. • Quadratic model: Y = b0 + b1x1 + b2x1^2 + e • General polynomial model: Y = b0 + b1x1 + b2x1^2 + b3x1^3 + ... + bkx1^k + e Any independent variable, xi, which appears in the polynomial regression model as xi^k is called a kth-degree term.
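The quadratic model above can be sketched numerically. A minimal NumPy example (with made-up, noise-free data, so the fit recovers the coefficients exactly):

```python
import numpy as np

# Fit the quadratic model y = b0 + b1*x + b2*x^2 by least squares.
# Hypothetical data generated from known coefficients b0=1, b1=2, b2=3.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 1 + 2 * x + 3 * x**2              # noise-free, so the fit is exact

# Design matrix: one column per polynomial term, plus the intercept column.
X = np.column_stack([np.ones_like(x), x, x**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
# b is approximately [1, 2, 3]
```

Higher-degree terms are added the same way, as extra columns x**3, ..., x**k.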

  3. Polynomial model shapes. [Figure: a linear fit versus a quadratic fit to the same curved data.] Adding one or more terms to the model can significantly improve the model fit.

  4. Incorporating Additional Predictors Simple additive multiple regression model: y = b0 + b1x1 + b2x2 + b3x3 + ... + bkxk + e Additive (Effect) Assumption: the expected change in y per unit increment in xj is constant and does not depend on the value of any other predictor; this change in y is equal to bj.

  5. Additive regression models: For two independent variables, the response is modeled as a surface.

  6. Interpreting Parameter Values (Model Coefficients) • “Intercept” b0 - the value of y when all predictors are 0. • “Partial slopes” b1, b2, b3, ..., bk - bj describes the expected change in y per unit increment in xj when all other predictors in the model are held at a constant value.
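As an illustration of the partial-slope interpretation, a small NumPy sketch with hypothetical, noise-free data:

```python
import numpy as np

# Additive model y = b0 + b1*x1 + b2*x2 with known coefficients (5, 2, -1.5).
rng = np.random.default_rng(0)
x1 = rng.uniform(0, 10, 50)
x2 = rng.uniform(0, 10, 50)
y = 5 + 2.0 * x1 - 1.5 * x2          # noise-free for clarity

X = np.column_stack([np.ones_like(x1), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
# b[1] estimates the expected change in y per unit increment in x1,
# holding x2 fixed; here it recovers 2.0.
```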

  7. Graphical depiction of bj. b1 - slope in the direction of x1; b2 - slope in the direction of x2.

  8. Multiple Regression with Interaction Terms Y = b0 + b1x1 + b2x2 + b3x3 + ... + bkxk + b12x1x2 + b13x1x3 + ... + b1kx1xk + ... + b(k-1,k)x(k-1)xk + e The cross-product terms quantify the interaction among predictors. Interactive (Effect) Assumption: the effect of one predictor, xi, on the response, y, will depend on the value of one or more of the other predictors.

  9. Interpreting Interaction In the interaction model, b1 is no longer the expected change in Y per unit increment in X1, and b12 has no easy stand-alone interpretation: the effect on y of a unit increment in X1 now depends on X2. Define the effect of a unit increment in X1 at a given X2 as b1 + b12X2; when there is no interaction (b12 = 0), this reduces to b1.
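The dependence of the X1 effect on X2 can be checked directly. A small sketch with hypothetical coefficients chosen for illustration:

```python
# Interaction model E(y) = b0 + b1*x1 + b2*x2 + b12*x1*x2,
# with hypothetical coefficients for illustration only.
b0, b1, b2, b12 = 1.0, 2.0, 3.0, 0.5

def expected_y(x1, x2):
    return b0 + b1 * x1 + b2 * x2 + b12 * x1 * x2

# The effect of a unit increment in x1 equals b1 + b12*x2,
# so it changes with x2 whenever b12 != 0.
effect_at_x2_0 = expected_y(1, 0) - expected_y(0, 0)   # b1 = 2.0
effect_at_x2_4 = expected_y(1, 4) - expected_y(0, 4)   # b1 + 4*b12 = 4.0
```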

  10. [Figure: y versus x1 for x2 = 0, 1, 2. No interaction: parallel lines with common slope b1, intercept b0, and constant vertical separation b2. Interaction: intercepts b0, b0 + b2, b0 + 2b2 and slopes b1, b1 + b12, b1 + 2b12, so the lines are no longer parallel.]

  11. Multiple regression models with interaction: [Figure: depending on the sign of the interaction coefficient, the lines move apart or come together as x1 increases.]

  12. Effect of the Interaction Term in Multiple Regression: the fitted surface is twisted.

  13. A Protocol for Multiple Regression Identify all possible predictors. Establish a method for estimating model parameters and their standard errors. Develop tests to determine if a parameter is equal to zero (i.e., no evidence of association). Reduce the number of predictors appropriately. Develop predictions and associated standard errors.

  14. Estimating Model Parameters: Least Squares Estimation Assume a random sample of n observations (yi, xi1, xi2, ..., xik), i = 1, 2, ..., n. The best predicting equation, ŷ = b0 + b1x1 + b2x2 + ... + bkxk, is found by choosing the values b0, b1, ..., bk that minimize the sum of squared errors SSE = Σi (yi − ŷi)^2.

  15. Normal Equations Take the partial derivatives of the SSE function with respect to b0, b1, ..., bk and set each derivative equal to 0. Solve this system of k+1 equations in k+1 unknowns to obtain the equations for the parameter estimates.
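The normal equations can be solved directly in a few lines. A sketch with made-up, noise-free data (in practice, numerically safer routines such as lstsq are preferred over forming X'X explicitly):

```python
import numpy as np

# Solve the normal equations (X'X) b = X'y for the least squares estimates.
x1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
x2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = 1 + 2 * x1 + 3 * x2              # noise-free data from known coefficients

X = np.column_stack([np.ones_like(x1), x1, x2])   # k+1 = 3 columns
b = np.linalg.solve(X.T @ X, X.T @ y)             # k+1 equations, k+1 unknowns
```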

  16. Coefficient of Multiple Determination An overall measure of how well the full model performs. • Denoted R2. • Defined as the proportion of the variability in the dependent variable y that is accounted for by the independent variables, x1, x2, ..., xk, through the regression model. • With only one independent variable (k=1), R2 = r2, the square of the simple correlation coefficient.

  17. Computing the Coefficient of Determination R2 = 1 − SSE/TSS = (TSS − SSE)/TSS, where TSS = Σ(yi − ȳ)^2 is the total sum of squares and SSE is the sum of squared residuals.
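A sketch of the computation on made-up simple-linear-regression data, using the SSE and TSS definitions above:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])          # roughly y = 2x

X = np.column_stack([np.ones_like(x), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ b

sse = np.sum((y - y_hat) ** 2)                    # unexplained variation
tss = np.sum((y - y.mean()) ** 2)                 # total variation about the mean
r2 = 1 - sse / tss
# With k = 1, r2 equals the squared simple correlation between x and y.
```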

  18. Multicollinearity A further assumption in multiple regression (absent in SLR) is that the predictors (x1, x2, ..., xk) are statistically uncorrelated; that is, the predictors do not co-vary. When the predictors are strongly correlated (correlation greater than about 0.6), the multiple regression model is said to suffer from problems of multicollinearity. [Figure: scatterplots of predictor pairs with r = 0, r = 0.6, and r = 0.8.]

  19. Effect of Multicollinearity on the Fitted Surface [Figure: under extreme collinearity the data fall along a line in the (x1, x2) plane, so the fitted surface for y is poorly determined.]

  20. Multicollinearity leads to
  • Numerical instability in the estimates of the regression parameters – wild fluctuations in these estimates if a few observations are added or removed.
  • No longer having simple interpretations for the regression coefficients in the additive model.
  Ways to detect multicollinearity
  • Scatterplots of the predictor variables.
  • Correlation matrix for the predictor variables – the higher these correlations, the worse the problem.
  • Variance Inflation Factors (VIFs) reported by software packages. Values larger than 10 usually signal a substantial amount of collinearity.
  What can be done about multicollinearity
  • Regression estimates are still OK, but the resulting confidence/prediction intervals are very wide.
  • Choose explanatory variables wisely! (E.g., consider omitting one of two highly correlated variables.)
  • More advanced solutions: principal components analysis; ridge regression.
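A VIF can be computed from first principles with NumPy. A sketch using simulated predictors, where x2 is nearly a copy of x1:

```python
import numpy as np

def vif(X):
    """VIF_j = 1 / (1 - R_j^2), where R_j^2 is the R-squared from regressing
    column j of X on the remaining columns (with an intercept)."""
    n, k = X.shape
    out = []
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        b = np.linalg.lstsq(others, X[:, j], rcond=None)[0]
        resid = X[:, j] - others @ b
        r2_j = 1 - resid @ resid / np.sum((X[:, j] - X[:, j].mean()) ** 2)
        out.append(1.0 / (1.0 - r2_j))
    return np.array(out)

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = x1 + 0.05 * rng.normal(size=100)   # nearly collinear with x1
x3 = rng.normal(size=100)               # unrelated predictor
v = vif(np.column_stack([x1, x2, x3]))
# v[0] and v[1] are far above the rule-of-thumb cutoff of 10; v[2] is near 1.
```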

  21. Testing in Multiple Regression • Testing individual parameters in the model. • Computing predicted values and associated standard errors. Overall AOV F-test H0: b1 = b2 = ... = bk = 0 (none of the explanatory variables is a significant predictor of Y). Reject H0 if F = MSR/MSE > F(α; k, n − k − 1).
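The overall F statistic can be assembled from the sums of squares. A sketch on simulated data (the critical value F(α; k, n−k−1) would come from an F table or software):

```python
import numpy as np

# Overall ANOVA F statistic: F = MSR / MSE = (SSR / k) / (SSE / (n - k - 1)).
rng = np.random.default_rng(3)
n, k = 50, 2
x1 = rng.uniform(0, 5, n)
x2 = rng.uniform(0, 5, n)
y = 2 + 3 * x1 + 1 * x2 + rng.normal(0, 1, n)     # strong true relationship

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ b
sse = np.sum((y - y_hat) ** 2)                    # error sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)             # regression sum of squares
F = (ssr / k) / (sse / (n - k - 1))
# Compare F with F(alpha; k, n-k-1); here F is very large, so H0 is rejected.
```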

  22. Standard Error for Partial Slope Estimate The estimated standard error of bj is SE(bj) = sε √[ 1 / ((1 − Rj^2) Σi (xij − x̄j)^2) ], where sε = √(SSE / (n − k − 1)) and Rj^2 is the coefficient of determination for the model with xj as the dependent variable and all other x variables as predictors. What happens if all the predictors are truly independent of each other (Rj^2 = 0)? If there is high dependency (Rj^2 near 1)?

  23. Confidence Interval A 100(1 − α)% confidence interval for bj: bj ± t(α/2, n − k − 1) · SE(bj). The df for SSE, n − (k + 1), reflects the number of data points minus the number of parameters that have to be estimated.

  24. Testing whether a partial slope coefficient is equal to zero. H0: bj = 0. Alternatives: Ha: bj ≠ 0 (or bj > 0, or bj < 0). Test statistic: t = bj / SE(bj). Rejection region (two-sided): |t| > t(α/2, n − k − 1).
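Putting the last three slides together, a sketch of the standard errors and t statistics on simulated data (here x2 has no true effect, so its coefficient should not test as significant):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40
x1 = rng.uniform(0, 10, n)
x2 = rng.uniform(0, 10, n)
y = 3 + 1.5 * x1 + 0.0 * x2 + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), x1, x2])
b = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ b
df = n - X.shape[1]                         # n - (k+1) degrees of freedom
s2 = resid @ resid / df                     # estimate of the error variance
cov_b = s2 * np.linalg.inv(X.T @ X)         # covariance matrix of the estimates
se = np.sqrt(np.diag(cov_b))                # standard errors SE(bj)
t = b / se                                  # compare |t| with t(alpha/2, df)
# t[1] is large (x1 matters); t[2] is typically small (x2 does not).
```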

  25. Predicting Y • We use the least squares fitted value, ŷ, as our predictor of a single value of y at a particular value of the explanatory variables (x1, x2, ..., xk). • The corresponding interval about the predicted value of y is called a prediction interval. • The least squares fitted value also provides the best predictor of E(y), the mean value of y, at a particular value of (x1, x2, ..., xk). The corresponding interval for the mean prediction is called a confidence interval. • Formulas for these intervals are much more complicated than in the case of SLR; they cannot be calculated by hand (see the book).

  26. Minimum R2 for a “Significant” Regression Since we have formulas for R2 and F in terms of n, k, SSE and TSS, we can relate these two quantities. We can then ask: what is the minimum R2 that will ensure the regression model is declared significant, as measured by the appropriate quantile from the F distribution? The answer (below) shows that this depends on n, k, and the chosen F critical value.
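The algebra can be captured in two small functions (pure Python; the critical value shown is an illustrative placeholder, not a looked-up table entry):

```python
# F = (R^2 / k) / ((1 - R^2) / (n - k - 1)); solving for R^2 gives the
# minimum R^2 for a significant regression:
#   R^2_min = k * F_crit / (k * F_crit + n - k - 1)

def r2_min(f_crit, n, k):
    return k * f_crit / (k * f_crit + n - k - 1)

def f_from_r2(r2, n, k):
    return (r2 / k) / ((1 - r2) / (n - k - 1))

# Round trip: plugging R^2_min back into F recovers the critical value.
n, k, f_crit = 30, 3, 2.98     # f_crit is an illustrative value, not from a table
```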

  27. Minimum R2 for Simple Linear Regression (k=1) With k = 1, the bound reduces to R2min = Fcrit / (Fcrit + n − 2).
