Explore Multiple Regression - similar to simple regression but with multiple independent variables, coefficients, and partial slopes. Learn about collinearity and variable selection methods. Dive into Logistic Regression for qualitative dependent variables and odds ratios.
Multiple Regression • Similar to simple regression, but with more than one independent variable • R² has the same interpretation • Residual analysis is similar • Confidence & prediction intervals are constructed similarly
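A minimal sketch, assuming the statsmodels library and two made-up predictors x1 and x2, of how a multiple regression fit reports the same quantities as simple regression: R², residuals, and confidence/prediction intervals.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
x2 = rng.normal(size=100)
y = 2.0 + 1.5 * x1 - 0.8 * x2 + rng.normal(scale=0.5, size=100)

X = sm.add_constant(np.column_stack([x1, x2]))   # intercept plus two predictors
model = sm.OLS(y, X).fit()

print(model.rsquared)              # R² has the same interpretation as in simple regression
print(model.resid[:5])             # residuals, used for the usual diagnostic checks
pred = model.get_prediction(X[:1])
print(pred.conf_int())             # confidence interval for the mean response
print(pred.conf_int(obs=True))     # prediction interval for a new observation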
Multiple Regression • A multiple regression model includes a coefficient for each independent variable • A simple case is a quadratic model in a single variable • An independent variable can be an indicator (dummy) variable • e.g. gender = 0 for female and gender = 1 for male • Coefficients are called “partial slopes”: each measures the effect of its variable with the other variables held fixed
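A minimal sketch of a quadratic term and an indicator (dummy) variable, assuming statsmodels’ formula interface; the age/gender data are fabricated for illustration, and each fitted coefficient is a partial slope.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "age": rng.uniform(20, 70, size=200),
    "gender": rng.integers(0, 2, size=200),   # dummy variable: 0 = female, 1 = male
})
df["y"] = (5 + 0.4 * df["age"] - 0.003 * df["age"] ** 2
           + 2 * df["gender"] + rng.normal(scale=1.0, size=200))

# Quadratic model in a single variable plus a dummy variable; each coefficient
# is a "partial slope": the effect of that term with the others held fixed.
model = smf.ols("y ~ age + I(age**2) + gender", data=df).fit()
print(model.params)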
Multiple Regression • A multiple regression model includes a coefficient for each independent variable • Collinearity occurs when two or more independent variables are correlated and thus explain the same information • The model can include interaction terms if the independent variables interact
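A minimal sketch of diagnosing collinearity with variance inflation factors (VIF) and of adding an interaction term, again assuming statsmodels; the variables and the VIF rule of thumb are illustrative assumptions.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(2)
x1 = rng.normal(size=200)
x2 = 0.9 * x1 + rng.normal(scale=0.3, size=200)   # strongly correlated with x1
df = pd.DataFrame({"x1": x1, "x2": x2})
df["y"] = 1 + 2 * df["x1"] + df["x2"] + rng.normal(size=200)

# A VIF well above roughly 5-10 suggests a predictor largely duplicates the others.
X = df[["x1", "x2"]].assign(const=1.0)
for i, name in enumerate(X.columns):
    print(name, variance_inflation_factor(X.values, i))

# Interaction term: the "x1 * x2" formula expands to x1 + x2 + x1:x2,
# letting the effect of x1 depend on the level of x2.
model = smf.ols("y ~ x1 * x2", data=df).fit()
print(model.params)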
Variable Selection • Several procedures have been developed for selecting the best model for predicting Y from several independent variables (X’s) • Compare all possible regressions • Backward elimination • Forward selection • Stepwise selection
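A minimal sketch of one of these procedures, backward elimination, under the assumption that variables are dropped one at a time by largest p-value with a 0.05 cutoff; forward and stepwise selection follow the same idea by adding (or adding and then possibly dropping) variables instead.

import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
df = pd.DataFrame(rng.normal(size=(200, 4)), columns=["x1", "x2", "x3", "x4"])
y = 1 + 2 * df["x1"] - 1.5 * df["x3"] + rng.normal(size=200)

candidates = list(df.columns)
while candidates:
    fit = sm.OLS(y, sm.add_constant(df[candidates])).fit()
    pvals = fit.pvalues.drop("const")    # ignore the intercept
    worst = pvals.idxmax()               # least significant remaining variable
    if pvals[worst] < 0.05:              # stop when every remaining term is significant
        break
    candidates.remove(worst)             # otherwise drop it and refit

print("selected variables:", candidates)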
Logistic Regression • A regression model with a qualitative (typically dichotomous) dependent variable • The dependent variable can be thought of as a binomial response • e.g. Y = 1 if the patient is cured, and Y = 0 otherwise • The model is constructed to predict P(Y = 1) using a logistic function
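A minimal sketch of fitting such a model, assuming statsmodels and a hypothetical cured-vs-dose dataset; the fitted model’s predict method returns the estimated P(Y = 1).

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({"dose": rng.uniform(0, 10, size=300)})
p_true = 1 / (1 + np.exp(-(-2 + 0.6 * df["dose"])))        # logistic function of dose
df["cured"] = rng.binomial(1, p_true.to_numpy())            # Y = 1 if cured, 0 otherwise

model = smf.logit("cured ~ dose", data=df).fit()
print(model.predict(pd.DataFrame({"dose": [2.0, 8.0]})))    # estimated P(Y = 1)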
Logistic Regression • Linear relationship between the natural log of the odds and the independent variables • The odds are the ratio of the probability of success to the probability of failure • Each coefficient describes the size of the contribution of that “risk factor”; exponentiating a coefficient gives an odds ratio
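A minimal sketch of the log-odds interpretation, using the same hypothetical cured-vs-dose data as above: the log of the odds of Y = 1 is linear in the predictors, and exponentiating a coefficient gives the odds ratio for a one-unit increase in that risk factor.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)
df = pd.DataFrame({"dose": rng.uniform(0, 10, size=300)})
p_true = 1 / (1 + np.exp(-(-2 + 0.6 * df["dose"])))
df["cured"] = rng.binomial(1, p_true.to_numpy())

model = smf.logit("cured ~ dose", data=df).fit()
# log( P(Y=1) / (1 - P(Y=1)) ) = b0 + b1 * dose   (linear in the predictor)
print(model.params)            # coefficients on the log-odds scale
print(np.exp(model.params))    # exp(b1): odds ratio per one-unit increase in dose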