
Experiences and Results from Initiating Field Defect Prediction and Product Test Prioritization Efforts at ABB Inc.
Paul Luo Li (Carnegie Mellon University), James Herbsleb (Carnegie Mellon University), Mary Shaw (Carnegie Mellon University), Brian Robinson (ABB Research)



Presentation Transcript


  1. Experiences and Results from Initiating Field Defect Prediction and Product Test Prioritization Efforts at ABB Inc. Paul Luo Li (Carnegie Mellon University) James Herbsleb (Carnegie Mellon University) Mary Shaw (Carnegie Mellon University) Brian Robinson (ABB Research)

  2. Field defects matter: Molson, Montreal, Canada; Heineken Spain, Madrid, Spain; Citizens Thermal Energy, Indianapolis, Indiana, USA; Shajiao Power Station, Guangdong, China

  3.–5. ABB’s risk mitigation activities include… • Remove potential field defects by focusing systems/integration testing: higher-quality product; earlier defect detection (when it’s cheaper to fix) • Plan for maintenance by predicting the number of field defects within the first year: faster response for customers; less stress (e.g., for developers); more accurate budgets • Plan for future process improvement efforts by identifying important characteristics (i.e., categories of metrics): more effective improvement efforts

  6. Practical issues • How to conduct analysis with incomplete information? • How to select an appropriate modeling method for systems/integration test prioritization and process improvement planning? • How to evaluate the accuracy of predictions across multiple releases in time? We also share our experience of how we arrived at these results.

  7. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy using forward prediction evaluation • Empirical results • Conclusions

  8. We examined two systems at ABB. Product A: a real-time monitoring system; growing code base of ~300 KLOC; 13 major/minor/fix-pack releases; ~127 thousand changes committed by ~40 different people; dates back to 2000 (~5 years). Product B: a tool suite for managing real-time modules; stable code base of ~780 KLOC; 15 major/minor/fix-pack releases; ~50 people have worked on the project; dates back to 1996 (~9 years).

  9. We collected data from… • Request tracking system (Serena Tracker) • Version control system (Microsoft Visual Source Safe) • Experts (e.g. team leads and area leads)

  10. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions

  11. Breakdown of field defect modeling: Inputs. Predictors (metrics available before release): Product (Khoshgoftaar and Munson, 1989); Development (Ostrand et al., 2004); Deployment and usage (Mockus et al., 2005); Software and hardware configurations (Li et al., 2006).

  12. Breakdown of field defect modeling: Inputs → Modeling method (metrics-based methods)

  13. Breakdown of field defect modeling: Inputs → Modeling method → Outputs (field defects)

  14. Modeling process: Inputs → Modeling method → Outputs; take historical Inputs and Outputs to construct the model

  15. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions

  16. Outputs • Field defects: valid customer reported problems attributable to a release in Serena Tracker • Relationships • What predictors are related to field defects? • Quantities • What is the number of field defects?

  17. Outputs • Field defects: valid customer reported problems attributable to a release in Serena Tracker • Relationships • Plans for improvement • Targeted systems testing • Quantities • Maintenance resource planning Remember these objectives

  18. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions

  19. Inputs (Predictors) • Product metrics: lines of code, fan-in, Halstead’s difficulty, … • Development metrics: open issues, deltas, authors, … • Deployment and usage (DU) metrics: … (we’ll talk more about these) • Software and hardware configuration (SH) metrics: sub-system, Windows configuration, …

  20. Insight 1: use available information • ABB did not officially collect DU information (e.g., the number of installations) • Do the analysis without the information? • We collected data from available data sources that provided information on possible deployment and usage (see the sketch below) • Type of release • Elapsed time between releases • Improved validity • More accurate models • Justification for better data collection
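
As an illustration, here is a minimal sketch of deriving those two proxy DU predictors from release metadata alone. The release table, column names, and version-numbering rule are hypothetical, not ABB’s data; the sketch only shows how such proxies can be computed when installation counts are not collected.

```python
# A minimal sketch (hypothetical data): deriving proxy deployment-and-usage
# (DU) predictors -- type of release and elapsed time between releases --
# when installation counts are not available.
import pandas as pd

releases = pd.DataFrame({
    "version": ["3.0", "3.1", "3.1.1", "4.0"],
    "date": pd.to_datetime(["2003-01-15", "2003-07-02",
                            "2003-09-20", "2004-03-11"]),
})

def release_type(version: str) -> str:
    """Classify a version string as major, minor, or fix-pack (assumed scheme)."""
    parts = version.split(".")
    if len(parts) >= 3:
        return "fix-pack"
    return "major" if parts[1] == "0" else "minor"

releases["type"] = releases["version"].map(release_type)
# Days since the previous release: a rough proxy for accumulated field usage.
releases["days_since_prev"] = releases["date"].diff().dt.days

print(releases)
```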

  21. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions

  22. Methods to establish relationships • Rank Correlation (for improvement planning) • Single predictor • Defect modeling (for improvement planning and for systems/integration test prioritization) • Multiple predictors that complement each other
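
For the single-predictor rank-correlation analysis, a minimal sketch (with hypothetical per-release data; SciPy’s spearmanr is one way to compute it) could look like this:

```python
# A minimal sketch (hypothetical data): Spearman rank correlation between one
# predictor and field-defect counts across releases, for improvement planning.
from scipy.stats import spearmanr

deltas        = [120, 45, 310, 80, 200]   # changes per release (hypothetical)
field_defects = [ 14,  6,  35, 10,  22]   # field defects per release

rho, p_value = spearmanr(deltas, field_defects)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```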

  23.–26. Insight 2: select a modeling method based on explicability and quantifiability • Previous work selects methods on accuracy; however… • To prioritize product testing: identify faulty configurations (explicability) and quantify the relative fault-proneness of configurations (quantifiability) • For process improvement: identify characteristics related to field defects (explicability) and quantify the relative importance of characteristics (quantifiability) • Not all models have these qualities, e.g., neural networks and models with Principal Component Analysis

  27. The modeling method we used • Linear modeling with model selection • 39% less accurate than neural networks (Khoshgoftaar et al.) (example only: not a real model)

  28. The modeling method we used • Linear modeling with model selection. Function(Field defects) = B1*Input1 + B2*Input2 + B3*Input4 (example only: not a real model). Explicability: distinguish the effects of each predictor.

  29. The modeling method we used • Linear modeling with model selection. Function(Field defects) = B1*Input1 + B2*Input2 + B3*Input4 (example only: not a real model). Quantifiability: compare the effects of the predictors.
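
To make this concrete, here is a minimal sketch of linear modeling with model selection on synthetic data. The greedy backward elimination by AIC is our assumption for illustration, not necessarily the exact selection procedure used in the study; the surviving coefficients are the B_i that make the model explicable and quantifiable.

```python
# A minimal sketch (synthetic data): linear modeling with model selection via
# greedy backward elimination by AIC. Each surviving coefficient B_i isolates
# the effect of one predictor (explicability), and the coefficients can be
# compared against each other (quantifiability).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 40
X = rng.normal(size=(n, 4))              # e.g., deltas, authors, open issues, ...
y = 3.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(size=n)

def backward_select(X, y, cols):
    """Drop one predictor at a time while doing so improves (lowers) AIC."""
    while len(cols) > 1:
        current = sm.OLS(y, sm.add_constant(X[:, cols])).fit().aic
        trials = [(sm.OLS(y, sm.add_constant(X[:, [c for c in cols if c != d]]))
                   .fit().aic, d) for d in cols]
        aic, drop = min(trials)
        if aic >= current:
            break
        cols = [c for c in cols if c != drop]
    return cols

selected = backward_select(X, y, list(range(X.shape[1])))
model = sm.OLS(y, sm.add_constant(X[:, selected])).fit()
print("selected predictors:", selected)
print(model.params)                      # the fitted B_i coefficients
```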

  30. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions Skipping ahead… Read the paper

  31. Systems/Integration test prioritization. log(Field defects) = B1*Input1 + B2*Input2 + …
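
A minimal sketch of how such a fitted model could prioritize testing follows; the coefficients and per-configuration predictor values are hypothetical stand-ins, used only to show the ranking step.

```python
# A minimal sketch (hypothetical numbers): ranking software/hardware
# configurations by predicted fault-proneness under a fitted log-linear model
# log(field defects) = B1*x1 + B2*x2, so testing can focus on the riskiest.
import numpy as np

B = np.array([0.8, 1.3])                 # assumed fitted coefficients

configs = {                              # hypothetical SH-configuration predictors
    "Windows-subsystemA": np.array([1.2, 0.4]),
    "Windows-subsystemB": np.array([0.3, 1.1]),
    "Linux-subsystemA":   np.array([0.9, 0.9]),
}

predicted = {name: float(np.exp(x @ B)) for name, x in configs.items()}
for name, defects in sorted(predicted.items(), key=lambda kv: -kv[1]):
    print(f"{name}: predicted field defects ≈ {defects:.1f}")
```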

  32.–34. Systems/Integration test prioritization • Select a modeling method based on explicability and quantifiability

  35. Systems/Integration test prioritization • Select a modeling method based on explicability and quantifiability • Use available information • Improved validity • More accurate model • Justification for better data collection

  36. Systems/Integration test prioritization • Experts validated results • Quantitative justification for action • ABB found additional defects

  37. Talk outline • ABB systems overview • Field defect modeling overview • Outputs • Inputs • Insight 1: Use available information • Modeling methods • Insight 2: Select a modeling method based on explicability and quantifiability • Insight 3: Evaluate accuracy through time using forward prediction evaluation • Empirical results • Conclusions

  38. Risk mitigation activities enabled • Focusing systems/integration testing • Found additional defects • Plan for maintenance by predicting the number of field defects within the first year • Do not yet know if results are accurate enough for planning purposes • Plan for future process improvement efforts • May combine with prediction method to enable process adjustments

  39. Experiences recapped • Use available information when direct/preferred information is unavailable • Consider the explicability and quantifiability of a modeling method when the objectives are improvement planning and test prioritization • Use the forward prediction evaluation procedure to assess the accuracy of predictions for multiple releases in time. Details on the insights and results are in our paper.

  40. Thanks to: Ann Poorman, Janet Kaufman, Rob Davenport, Pat Weckerly. Experiences and Results from Initiating Field Defect Prediction and Product Test Prioritization Efforts at ABB Inc. Paul Luo Li (Carnegie Mellon University), James Herbsleb (Carnegie Mellon University), Mary Shaw (Carnegie Mellon University), Brian Robinson (ABB Research)

  41. Insight 3: use forward prediction evaluation • Accuracy is the correct criterion when predicting the number of field defects for maintenance resource planning • Current accuracy evaluation methods are not well suited for multi-release systems • Cross-validation • Random data withholding

  42.–49. [Animation frames repeating slide 41 over a Release 1–Release 4 timeline, illustrating which releases cross-validation and random data withholding pick for training and testing]

  50. Insight 3: use forward prediction evaluation • Accuracy is the correct criterion when predicting the number of field defects for maintenance resource planning • Current accuracy evaluation methods are not well suited for multi-release systems • Cross-validation • Random data withholding • Only a non-random subset is available • Predicting for a past release is not the same as predicting for a future release. Not realistic!
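
A minimal sketch of forward prediction evaluation (synthetic data; the linear model is a stand-in): the model is always trained on strictly earlier releases and evaluated on the next one, rolling forward a release at a time.

```python
# A minimal sketch (synthetic data): forward prediction evaluation. Unlike
# cross-validation or random withholding, training data always precedes the
# release being predicted, mirroring how predictions are made in practice.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n_releases = 10
X = rng.normal(size=(n_releases, 2))                  # predictors per release
y = 5 + 2 * X[:, 0] + rng.normal(size=n_releases)     # field defects (synthetic)

errors = []
for t in range(3, n_releases):                        # need a few releases to train
    model = sm.OLS(y[:t], sm.add_constant(X[:t])).fit()   # strictly earlier releases
    x_next = np.concatenate([[1.0], X[t]])[None, :]       # prepend the constant
    errors.append(abs(model.predict(x_next)[0] - y[t]))

print(f"mean absolute forward-prediction error: {np.mean(errors):.2f}")
```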
