
Empirical Validation of OO Metrics in Two Different Iterative Software Processes



Presentation Transcript


  1. Empirical Validation of OO Metrics in Two Different Iterative Software Processes Mohammad Alshayeb Information and Computer Science Department King Fahd University of Petroleum and Minerals http://www.alshayeb.com Alshayeb, M. and Wei Li, “An Empirical Validation of Object-Oriented Metrics in Two Iterative Processes,” IEEE Transactions on Software Engineering, Vol. 29, No. 11, November 2003.

  2. Agenda • Research Objective. • Introduction. • Research Data. • Research Hypotheses. • Validation Studies. • Conclusion. • Future Work. 2

  3. Objective • Validate the Predictive Capability of Object-Oriented (OO) Metrics in Two Iterative Software Processes. 3

  4. Measurement Acceptance • Software Measurement Lacks Wide Acceptance Because: • Management personnel rarely know how to analyze, interpret, or use software metrics. • There are various ways of analyzing software metrics. • Lack of measurement validation. 4

  5. Measurement Validation • Validation Assures That Measurements Assess What They Are Supposed to Measure. • Types of Validation: • Theoretical: the measure is a proper numerical characterization of a claimed attribute. • Empirical: establishes the accuracy of a prediction system by comparing the model performance with known data in a given environment. 5

  6. Software Processes • The Waterfall Model • Prototype Model • Rapid Application Development Model • The Incremental Model • The Spiral Model • Component Assembly Model • The Iterative Development Process Model • Extreme Programming 6

  7. The Iterative Development Process (figure): Domain Analysis and Customer Requirements feed Requirements Definition, then Software Architecture; each iteration performs Risk Assessment, Prototyping of the Highest Risk, Test Suite & Environment Development, Integration with the Previous Iteration, and an Iteration Release. 7 Luckey et al., “Iterative Development Process with Proposed Applications,” 1992

  8. Agile Process: Extreme Programming (XP) • “A lightweight, efficient, low-risk, flexible, predictable, scientific, and fun way to develop software” – Beck • XP Applicability: • Projects with vague or changing requirements. • Small groups of programmers. • The need for customer input. 8

  9. XP Is Different • Continuity of Feedback Throughout Its Short Cycles. • Incremental Planning. • Flexibility in Scheduling on the Basis of Business Needs. • Dependency on Unit Tests Written by Both Developers and Customers. • Pair Programming. 9

  10. XP Breakdown (figure): the customer divides the system into user stories (Story 1–3); the stories are organized by priority; an iteration is assigned for the development of each story (Iteration 1–3); each iteration is broken into tasks (Task 1–3) verified by test cases (Test Case 1–3); code that passes its test cases is integrated with the system, yielding Releases 1–3, each a working, tested system with its story integrated. 10

  11. XP and Other Software Processes (figure): across the Analysis, Design, Implementation, and Testing phases, processes range from Waterfall (one long pass), through the long-cycled iterative process and the short-cycled iterative process, to XP (many very short cycles). Beck, “Embracing Change with Extreme Programming,” 1999 11

  12. Frameworks • “A Framework is an OO class hierarchy plus a built-in model which defines how the objects derived from the hierarchy interact with one another” – Lewis et al. 12

  13. OO Metrics Suites • Li Suite • Number of Ancestor Classes (NAC) • Number of Local Methods (NLM) • Class Method Complexity (CMC) • Number of Descendent Classes (NDC) • Coupling Through Abstract data types (CTA) • Coupling Through Message passing (CTM) • C&K Suite • Weighted Methods per Class (WMC) • Depth of Inheritance Tree (DIT) • Number Of Children (NOC) • Coupling Between Objects (CBO) • Response For a Class (RFC) • Lack of Cohesion of Methods (LCOM) 13
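Two of the C&K metrics above can be sketched in a few lines of Python introspection. This is a toy illustration, not the measurement tool used in the study: WMC here weights every method as 1, and the example classes are invented.

```python
import inspect

def wmc(cls):
    """Weighted Methods per Class: count of methods defined in the class
    itself, each weighted 1 (a common simplification)."""
    return len([name for name, _ in inspect.getmembers(cls, inspect.isfunction)
                if name in cls.__dict__])

def dit(cls):
    """Depth of Inheritance Tree: number of inheritance steps from the
    class up to its root user-defined class (object excluded)."""
    return len(inspect.getmro(cls)) - 2

class Shape:                 # hypothetical example classes
    def area(self): ...
    def perimeter(self): ...

class Circle(Shape):
    def area(self): ...      # overrides area; perimeter is inherited

print(wmc(Shape), dit(Shape))    # -> 2 0
print(wmc(Circle), dit(Circle))  # -> 1 1
```

Note that Circle's WMC is 1, not 2: only locally defined methods count, so the inherited `perimeter` is excluded.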

  14. Research Data • Short-cycled iterative process: Extreme Programming (XP); data: two Java applications. • Long-cycled iterative process: the framework process; data: JDK releases. 14

  15. Research Data 15

  16. Data Collection • The tasks planned for the current iteration cycle. The tasks come from the user stories, which are supplied by the customer. • A description of the progress, or failure of progress, towards completing the tasks. • The time spent on each task, measured in hours. • A description of the problems encountered while implementing a task and the attempted solutions. • A description of the changes made to the system, the reasons for the changes, and the affected classes. 16

  17. Data Analysis • Tool support: • WebGain’s Quality Analyzer (Metamata Metrics) • Object-oriented metrics collection. • Collects 12 metrics. • Krakatau Project Manager for Java • Collects development/maintenance effort measured in lines of code. 17

  18. Metrics Validation (figure): does the software process affect metrics validation? – Alshayeb and Li 18

  19. Multiple Regression • Investigates the relationship between a dependent variable and multiple independent variables: y = b0 + b1x1 + b2x2 + … + bkxk + e • The x’s are the independent (predictor or regressor) variables. • y is the dependent (response) variable. • e, the statistical error, is the difference between the observed value y and the fitted value (b0 + b1x1 + b2x2 + … + bkxk). • b: regression coefficients. • R²: the percentage of variance in the dependent variable explained by the independent variables. 19
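As a concrete sketch, the regression fit and R² above can be computed with NumPy's least squares. The predictor and response values below are made up purely for illustration; they are not data from the study.

```python
import numpy as np

# Illustrative data only: two predictors per class (say, hypothetical WMC
# and CTM values) and a response (say, lines added in the next iteration).
X = np.array([[10, 3], [25, 7], [40, 12], [15, 5], [30, 9]], dtype=float)
y = np.array([120.0, 300.0, 510.0, 180.0, 360.0])

# Prepend an intercept column so the model is y = b0 + b1*x1 + b2*x2 + e.
A = np.column_stack([np.ones(len(X)), X])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# R^2: share of the variance in y explained by the predictors.
ss_res = np.sum((y - A @ b) ** 2)
ss_tot = np.sum((y - y.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(b)   # regression coefficients b0, b1, b2
print(r2)  # close to 1 for this nearly linear toy data
```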

  20. Empirical Validation of Object-Oriented Metrics • Empirically Validate the Predictive Capability of OO Metrics in the Long-Cycled (Framework) and Short-Cycled (Agile) Iterative Software Processes • Investigate the relationship between metrics and development/maintenance efforts in two forms of iterative processes. • Types of Changes: • Lines added (LA), Lines changed (LC), and Lines deleted (LD). • Refactoring and Error-Fix efforts. 20

  21. Example of Lines Added, Deleted, and Changed (figure: a source-file diff with lines marked Deleted, Added, and Changed) 21
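One way to derive the three counts from two versions of a file is Python's difflib. The mapping from diff opcodes to LA/LC/LD below is one plausible counting scheme, not necessarily the exact rule used in the study, and the code snippets compared are invented.

```python
import difflib

def count_changes(old_lines, new_lines):
    """Return (LA, LC, LD): lines added, changed, and deleted between
    two versions of a source file."""
    la = lc = ld = 0
    matcher = difflib.SequenceMatcher(a=old_lines, b=new_lines)
    for tag, i1, i2, j1, j2 in matcher.get_opcodes():
        if tag == "replace":
            common = min(i2 - i1, j2 - j1)
            lc += common                 # lines rewritten in place
            la += (j2 - j1) - common     # surplus new lines
            ld += (i2 - i1) - common     # surplus removed lines
        elif tag == "insert":
            la += j2 - j1
        elif tag == "delete":
            ld += i2 - i1
    return la, lc, ld

old = ["int a;", "int b;", "print(a);"]
new = ["int a;", "int b = 1;", "print(a);", "print(b);"]
print(count_changes(old, new))  # -> (1, 1, 0): 1 added, 1 changed, 0 deleted
```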

  22. Research Hypotheses • Hypothesis 1: Using OO metrics, we can predict design changes indicated by added, changed, and deleted source lines of code in classes from one iteration (release) to the next in the long-cycled framework iterative process. • Hypothesis 2: Using OO metrics, we can predict added, changed, and deleted source lines of code in classes from one iteration to the next in the short-cycled XP iterative process. 22

  23. Research Hypotheses • Hypothesis 3: Using OO metrics, we can predict error-fix maintenance effort (measured in man-hours) in classes from one iteration to the next in the short-cycled XP iterative process. • Hypothesis 4: Using OO metrics, we can predict refactoring effort (measured in man-hours) in classes from one iteration to the next in the short-cycled XP iterative process. 23

  24. Dependent and Independent Variables – Hypotheses 1 and 2 • Dependent Variables: • LA, LC, and LD. • Independent Variables: • WMC, DIT, LCOM, NLM, CTA, and CTM metrics. 24

  25. Research Results – Hypothesis 1 • (Regression results table for Hypothesis 1) 25

  26. Research Results – Hypothesis 1 • Object-Oriented Metrics Cannot Predict Design Changes Indicated by Added, Changed, and Deleted Source Lines of Code in Classes From One Iteration (Release) to the Next in the Long-cycled Framework Iterative Process. 26

  27. Research Results – Hypothesis 2 • (Regression results table for Hypothesis 2) 27

  28. Research Results – Hypothesis 2 • (Regression results table for Hypothesis 2, continued) 28

  29. Research Results -Hypothesis 2 • Object-Oriented Metrics May Predict Design Changes Indicated by Added, Changed, and Deleted Source Lines of Code in Classes From One Iteration to the Next in the Short-cycled XP Iterative Process Only When System Design Accumulates to Certain Mass. 29

  30. Dependent and Independent Variables – Hypotheses 3 and 4 • Dependent Variables: • Refactoring and Error-Fix efforts. • Independent Variables: • WMC, DIT, LCOM, NLM, CTA, and CTM metrics. 30

  31. Research Results – Hypotheses 3, 4 • (Regression results table for Hypotheses 3 and 4) 31

  32. Research Results – Hypotheses 3, 4 • (Regression results table for Hypotheses 3 and 4, continued) 32

  33. Research Results –Hypotheses 3,4 • Object-oriented Metrics Can Predict the Refactoring Effort and Error-fix Maintenance Effort in the XP-like Process When the System Design Accumulates to Certain Mass. 33

  34. Conclusion • The Predictive Capability of OO Metrics: • OO metrics are reasonably effective in predicting design changes in the short-cycled XP-like process, but they are largely ineffective in the long-cycled framework evolution process. • OO metrics' predictive capability is limited to the design and implementation changes during the development iterations, not the long-term evolution of an established system in different releases. • OO metrics are effective in predicting error-fix and refactoring efforts in the short-cycled agile process. 34

  35. Conclusion • The Development Software Process is Related Directly to the Predictive and Project-Progress Indicative Capabilities of OO Metrics: • It is thus essential to clearly identify the software process when validating OO metrics. 35

  36. Future Work • Studies Can Target OO Metrics Capabilities in Different Software Processes. • Validation Studies With More Diverse Data Sets Are Needed to Test Prediction Accuracy. • Automate the Steps Followed in This Study and Build a CASE Tool to Gather, Analyze, and Use the Metrics and the Statistical Models. 36

  37. Questions 37
