Chapter 13 Generalized Linear Models

  1. Chapter 13: Generalized Linear Models. Linear Regression Analysis 5E, Montgomery, Peck & Vining

  2. Generalized Linear Models • Traditional applications of linear models, such as designed experiments (DOX) and multiple linear regression, assume that the response variable • Is normally distributed • Has constant variance • Is independent • There are many situations where these assumptions are inappropriate • The response is either binary (0, 1) or a count • The response is continuous, but nonnormal

  3. Some Approaches to These Problems • Data transformation • Induce approximate normality • Stabilize variance • Simplify model form • Weighted least squares • Often used to stabilize variance • Generalized linear models (GLM) • Approach is about 25-30 years old, unifies linear and nonlinear regression models • Response distribution is a member of the exponential family (normal, exponential, gamma, binomial, Poisson)

  4. Generalized Linear Models • Original applications were in the biopharmaceutical sciences • Lots of recent interest in GLMs in industrial statistics • GLMs unify linear and nonlinear regression models; linear regression with OLS is a special case • Parameter estimation is by maximum likelihood (assume that the response distribution is known) • Inference on parameters is based on large-sample or asymptotic theory • We will consider logistic regression, Poisson regression, then the GLM

  5. References • Montgomery, D. C., Peck, E. A., and Vining, G. G. (2012), Introduction to Linear Regression Analysis, 5th Edition, Wiley, New York (see Chapter 13) • Myers, R. H., Montgomery, D. C., Vining, G. G., and Robinson, T. J. (2010), Generalized Linear Models with Applications in Engineering and the Sciences, 2nd Edition, Wiley, New York • Hosmer, D. W. and Lemeshow, S. (2000), Applied Logistic Regression, 2nd Edition, Wiley, New York • Lewis, S. L., Montgomery, D. C., and Myers, R. H. (2001), “Confidence Interval Coverage for Designed Experiments Analyzed with GLMs”, Journal of Quality Technology 33, pp. 279-292 • Lewis, S. L., Montgomery, D. C., and Myers, R. H. (2001), “Examples of Designed Experiments with Nonnormal Responses”, Journal of Quality Technology 33, pp. 265-278 • Myers, R. H. and Montgomery, D. C. (1997), “A Tutorial on Generalized Linear Models”, Journal of Quality Technology 29, pp. 274-291

  6. Binary Response Variables • The outcome (or response, or endpoint) values 0, 1 can represent “success” and “failure” • Occurs often in the biopharmaceutical field; dose-response studies, bioassays, clinical trials • Industrial applications include failure analysis, fatigue testing, reliability testing • For example, functional electrical testing on a semiconductor can yield: • “success”, in which case the device works • “failure”, due to a short, an open, or some other failure mode

  7. Binary Response Variables • Possible model: y_i = x_i′β + ε_i, where y_i takes only the values 0 and 1 • The response y_i is a Bernoulli random variable with P(y_i = 1) = π_i, so E(y_i) = π_i and Var(y_i) = π_i(1 − π_i)

  8. Problems With This Model • The error terms take on only two values, so they can’t possibly be normally distributed • The variance of the observations is a function of the mean (see previous slide) • A linear response function could result in predicted values that fall outside the 0, 1 range, and this is impossible because 0 ≤ E(y_i) = π_i ≤ 1
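The mean-variance relationship behind the second bullet can be checked numerically. A minimal sketch (the probabilities are illustrative, not from the chapter):

```python
import numpy as np

# For a Bernoulli response, E(y) = pi and Var(y) = pi * (1 - pi):
# the variance is a function of the mean, so the constant-variance
# assumption of ordinary least squares cannot hold.
pi = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
variance = pi * (1 - pi)
print(variance)  # largest at pi = 0.5, shrinking toward 0 and 1
```

Note the symmetry: pi = 0.1 and pi = 0.9 give the same variance, 0.09.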

  9. Binary Response Variables – The Challenger Data • Data for space shuttle launches and static tests prior to the launch of Challenger

  10. Binary Response Variables • There is a lot of empirical evidence that the response function should be nonlinear; an “S” shape is quite logical • See the scatter plot of the Challenger data • The logistic response function is a common choice

  11. [figure]

  12. The Logistic Response Function • The logistic response function can be easily linearized. Let E(y) = exp(x′β)/(1 + exp(x′β)) and let η = x′β be the linear predictor • Define η = ln[E(y)/(1 − E(y))] = ln[π/(1 − π)] • This is called the logit transformation
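The linearization on this slide can be sketched in code; the function names are illustrative:

```python
import numpy as np

def logistic(eta):
    # Logistic response function: pi = exp(eta) / (1 + exp(eta))
    return np.exp(eta) / (1.0 + np.exp(eta))

def logit(pi):
    # Logit transformation: eta = ln(pi / (1 - pi)), linear in the parameters
    return np.log(pi / (1.0 - pi))

eta = np.linspace(-4.0, 4.0, 9)     # linear predictor values
pi = logistic(eta)                  # probabilities on the "S" curve
print(np.allclose(logit(pi), eta))  # True: the logit undoes the logistic
```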

  13. Logistic Regression Model • Model: y_i = E(y_i) + ε_i, where E(y_i) = π_i = exp(x_i′β)/(1 + exp(x_i′β)) • The model parameters are estimated by the method of maximum likelihood (ML)

  14. A Logistic Regression Model for the Challenger Data (Using Minitab)

  Binary Logistic Regression: O-Ring Fail versus Temperature

  Link Function: Logit

  Response Information
  Variable  Value  Count
  O-Ring F  1          7  (Event)
            0         17
            Total     24

  Logistic Regression Table
                                                 Odds     95% CI
  Predictor  Coef      SE Coef  Z      P        Ratio  Lower  Upper
  Constant   10.875    5.703     1.91  0.057
  Temperat   -0.17132  0.08344  -2.05  0.040     0.84   0.72   0.99

  Log-Likelihood = -11.515

  15. A Logistic Regression Model for the Challenger Data

  Test that all slopes are zero: G = 5.944, DF = 1, P-Value = 0.015

  Goodness-of-Fit Tests
  Method           Chi-Square  DF  P
  Pearson              14.049  15  0.522
  Deviance             15.759  15  0.398
  Hosmer-Lemeshow      11.834   8  0.159

  16. Note that the fitted function has been extended down to 31 deg F, the temperature at which Challenger was launched

  17. Maximum Likelihood Estimation in Logistic Regression • The distribution of each observation y_i is f(y_i) = π_i^y_i (1 − π_i)^(1 − y_i), y_i = 0, 1 • The likelihood function is L(β) = ∏ π_i^y_i (1 − π_i)^(1 − y_i) • We usually work with the log-likelihood: ln L(β) = Σ [y_i ln π_i + (1 − y_i) ln(1 − π_i)]
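The log-likelihood above can be transcribed directly; a small synthetic check (the data are illustrative, not the Challenger set):

```python
import numpy as np

def log_likelihood(beta, X, y):
    # Bernoulli log-likelihood: sum of y*ln(pi) + (1 - y)*ln(1 - pi),
    # with pi = exp(x'beta) / (1 + exp(x'beta))
    pi = 1.0 / (1.0 + np.exp(-(X @ beta)))
    return np.sum(y * np.log(pi) + (1 - y) * np.log(1 - pi))

X = np.column_stack([np.ones(4), [1.0, 2.0, 3.0, 4.0]])
y = np.array([0, 0, 1, 1])
# At beta = 0 every pi is 0.5, so the log-likelihood is 4 * ln(0.5)
print(log_likelihood(np.zeros(2), X, y))  # -2.7725887...
```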

  18. Maximum Likelihood Estimation in Logistic Regression • The maximum likelihood estimators (MLEs) of the model parameters are those values that maximize the likelihood (or log-likelihood) function • ML has been around since the first part of the previous century • Often gives estimators that are intuitively pleasing • MLEs have nice properties; unbiased (for large samples), minimum variance (or nearly so), and they have an approximate normal distribution when n is large

  19. Maximum Likelihood Estimation in Logistic Regression • If we have n_i trials at each observation, we can write the log-likelihood as ln L(β) = Σ [y_i ln π_i + (n_i − y_i) ln(1 − π_i)] + constant • The derivative of the log-likelihood is ∂ ln L(β)/∂β = Σ (y_i − n_i π_i) x_i = X′(y − μ), where μ_i = n_i π_i

  20. Maximum Likelihood Estimation in Logistic Regression • Setting this last result to zero gives the maximum likelihood score equations X′(y − μ) = 0 • These equations look easy to solve…we’ve actually seen them before in linear regression: the normal equations X′(y − Xβ) = 0, where μ = Xβ

  21. Maximum Likelihood Estimation in Logistic Regression • Solving the ML score equations in logistic regression isn’t quite as easy, because • Logistic regression is a nonlinear model • It turns out that the solution is actually fairly easy, and is based on iteratively reweighted least squares or IRLS (see Appendix for details) • An iterative procedure is necessary because parameter estimates must be updated from an initial “guess” through several steps • Weights are necessary because the variance of the observations is not constant • The weights are functions of the unknown parameters
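The IRLS iteration described above can be sketched as follows. This is a minimal illustration on synthetic data, not the textbook's implementation:

```python
import numpy as np

def irls_logistic(X, y, n_iter=25):
    # Fit a logistic regression by iteratively reweighted least squares.
    # Each step solves a weighted least-squares problem with weights
    # W = diag(pi_i * (1 - pi_i)), recomputed from the current estimates.
    beta = np.zeros(X.shape[1])          # initial "guess"
    for _ in range(n_iter):
        eta = X @ beta
        pi = 1.0 / (1.0 + np.exp(-eta))
        W = pi * (1.0 - pi)              # variance-based weights
        z = eta + (y - pi) / W           # working (adjusted) response
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

# Synthetic illustration (not the Challenger data):
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, 200)
true_beta = np.array([0.5, 1.5])
pi = 1.0 / (1.0 + np.exp(-(true_beta[0] + true_beta[1] * x)))
y = rng.binomial(1, pi)
X = np.column_stack([np.ones_like(x), x])
print(irls_logistic(X, y))  # estimates near (0.5, 1.5)
```

Each pass is just a weighted least-squares solve, which is why a standard regression routine can be reused inside the loop.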

  22. Interpretation of the Parameters in Logistic Regression • The log-odds at x is ln[π(x)/(1 − π(x))] = β0 + β1x • The log-odds at x + 1 is ln[π(x + 1)/(1 − π(x + 1))] = β0 + β1(x + 1) • The difference in the log-odds is β1

  23. Interpretation of the Parameters in Logistic Regression • The odds ratio is found by taking antilogs: OR = e^β1 • The odds ratio is interpreted as the estimated multiplicative change in the odds of “success” associated with a one-unit increase in the value of the predictor variable

  24. Odds Ratio for the Challenger Data OR = e^(−0.17132) = 0.84 • This implies that every decrease of one degree in temperature increases the odds of O-ring failure by about 1/0.84 = 1.19, or 19 percent • The temperature at Challenger launch was 22 degrees below the lowest observed launch temperature, so now OR = e^(22 × (−0.17132)) = 0.0231 • This results in an increase in the odds of failure of 1/0.0231 = 43.34, or about 4200 percent! • There’s a big extrapolation here, but if you knew this prior to launch, what decision would you have made?
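The arithmetic on this slide follows directly from the fitted temperature slope on slide 14:

```python
import math

beta1 = -0.17132  # temperature coefficient from the Minitab output

# Odds ratio for a one-degree increase in temperature:
print(round(math.exp(beta1), 2))             # 0.84
# A one-degree decrease multiplies the odds of failure by:
print(round(1.0 / math.exp(beta1), 2))       # 1.19, about 19 percent
# Launch temperature was 22 degrees below the lowest observed value:
print(round(math.exp(22 * beta1), 4))        # 0.0231
print(round(1.0 / math.exp(22 * beta1), 2))  # 43.34
```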

  25. Inference on the Model Parameters

  26. Inference on the Model Parameters See slide 15; Minitab calls this “G”.

  27. Testing Goodness of Fit

  28. Pearson chi-square goodness-of-fit statistic: χ² = Σ (y_i − n_i π̂_i)² / [n_i π̂_i(1 − π̂_i)]
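A sketch of the Pearson statistic for grouped binary data (the groups below are hypothetical, not from the chapter):

```python
import numpy as np

def pearson_chi_square(y, n, pi_hat):
    # Sum over groups of (y_i - n_i*pi_i)^2 / (n_i*pi_i*(1 - pi_i))
    expected = n * pi_hat
    return np.sum((y - expected) ** 2 / (expected * (1.0 - pi_hat)))

y = np.array([2.0, 5.0, 9.0])        # observed successes per group
n = np.array([10.0, 10.0, 10.0])     # trials per group
pi_hat = np.array([0.2, 0.5, 0.8])   # fitted probabilities
print(pearson_chi_square(y, n, pi_hat))  # 0.625
```

The statistic is then referred to a chi-square distribution, as in the Minitab output on slide 15.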

  29. The Hosmer-Lemeshow goodness-of-fit statistic groups the observations into g classes (often deciles of risk) and computes HL = Σ over the g groups of (O_j − n_j π̄_j)² / [n_j π̄_j(1 − π̄_j)]

  30. Refer to slide 15 for the Minitab output showing all three goodness-of-fit statistics for the Challenger data

  31. Likelihood Inference on the Model Parameters • Deviance can also be used to test hypotheses about subsets of the model parameters (analogous to the extra SS method) • Procedure: fit the full model and the reduced model, compute the difference in the two deviances, and compare it to a chi-square distribution with degrees of freedom equal to the number of parameters set to zero

  32. Inference on the Model Parameters • Tests on individual model coefficients can also be done using Wald inference • Uses the result that the MLEs have an approximate normal distribution, so the distribution of Z = β̂ / se(β̂) is standard normal if the true value of the parameter is zero. Some computer programs report the square of Z (which is chi-square), and others calculate the P-value using the t distribution • See slide 14 for the Wald test on the temperature parameter for the Challenger data
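The Wald test for the temperature coefficient can be reproduced from the slide-14 output:

```python
import math

def wald_test(beta_hat, se):
    # Z = beta_hat / se(beta_hat); approximately N(0, 1) under H0: beta = 0
    z = beta_hat / se
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided normal p-value
    return z, p

z, p = wald_test(-0.17132, 0.08344)
print(round(z, 2), round(p, 3))  # -2.05 0.04, matching the Minitab output
```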

  33. Another Logistic Regression Example: The Pneumoconiosis Data • A 1959 article in Biometrics reported the data:

  34. [figure]

  35. [figure]

  36. The fitted model:

  37. [figure]

  38. [figure]

  39. [figure]

  40. Diagnostic Checking

  41. [figure]

  42. [figure]

  43. [figure]

  44. Consider Fitting a More Complex Model

  45. A More Complex Model Is the expanded model useful? The Wald test on the term (Years)² indicates that the term is probably unnecessary. Consider the difference in deviance between the quadratic and linear models. Compare the P-values for the Wald and deviance tests
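The deviance comparison can be sketched as a likelihood-ratio test; the deviance values below are placeholders, not the chapter's numbers:

```python
import math

def deviance_test(dev_reduced, dev_full):
    # Drop in deviance for one added term, approximately chi-square with 1 df
    chi_sq = dev_reduced - dev_full
    # For 1 df: P(chi2 > x) = P(|Z| > sqrt(x)) = erfc(sqrt(x / 2))
    p_value = math.erfc(math.sqrt(chi_sq / 2.0))
    return chi_sq, p_value

chi_sq, p = deviance_test(10.0, 7.5)  # placeholder deviances
print(round(chi_sq, 2), round(p, 3))  # 2.5 0.114
```

A small drop in deviance (large p-value) agrees with the Wald test: the extra term adds little.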

  46. [figure]

  47. [figure]

  48. [figure]

  49. Other models for binary response data • Logit model • Probit model • Complementary log-log model
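The three models can be compared through their inverse links, which map the linear predictor to a probability:

```python
import math

def inv_logit(eta):
    # Logit link: pi = exp(eta) / (1 + exp(eta))
    return 1.0 / (1.0 + math.exp(-eta))

def inv_probit(eta):
    # Probit link: pi = Phi(eta), the standard normal CDF
    return 0.5 * (1.0 + math.erf(eta / math.sqrt(2.0)))

def inv_cloglog(eta):
    # Complementary log-log link: pi = 1 - exp(-exp(eta))
    return 1.0 - math.exp(-math.exp(eta))

# Logit and probit are symmetric about pi = 0.5; the complementary
# log-log is not: at eta = 0 it gives 1 - 1/e, about 0.632.
for eta in (-2.0, 0.0, 2.0):
    print(round(inv_logit(eta), 3),
          round(inv_probit(eta), 3),
          round(inv_cloglog(eta), 3))
```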

  50. [figure]
