
SMU CSE 8314 Software Measurement and Quality Engineering



  1. SMU CSE 8314 Software Measurement and Quality Engineering Module 28 More Recommended Measures and Some Special Issues

  2. Outline • SEI CMM/CMMI Measures • Aligning Measures at Different Levels in the Organization • Other Special Issues

  3. SEI CMM/CMMI Measures

  4. The Range of SEI Measures • SEI has several measurement efforts, which do not always seem to be consistent • These include (among others): • SW metrics for acquisition managers • Software engineering database measures • Core measures for DoD systems • SEI metrics project • Measures related to the CMM/CMMI

  5. Measures Described in the Capability Maturity Model (CMM) • Key Process Areas for each level of the model: • Level 5 - Optimizing • Level 4 - Managed • Level 3 - Defined • Level 2 - Repeatable • Level 1 - Initial

  6. Key Process Areas • These are disciplines that must be mastered in order to achieve a given level of maturity • Each KPA has the following structure: • Goals that must be established by the org. • Commitments that the organization must make in order to achieve the goals (measured by such factors as money, people, and the relaxing of other goals) • Key Practices that demonstrate the organization is doing the KPA properly

  7. Monitors • These are things you should measure in order to assure that you are • Carrying out the key practices, or • Achieving the goals, or • Living up to the commitments • Each of these is associated with recommended measures

  8. SEI CMM/CMMI Measures Measures Recommended for Level 2

  9. 1) Tracking Performance • Progress: Actual vs. scheduled performance • Effort: Total staff hours expended vs. plan • Cost: Total cost vs. plan • See SMU Course CSE7315 - SW Project Management
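
The plan-vs-actual comparisons above can be sketched as a small helper. This is a hedged illustration, not an SEI-defined formula: the percent-deviation convention and the sample figures are assumptions made for the example.

```python
# Illustrative sketch of Level 2 tracking measures: compare actuals
# against plan for effort (staff hours) and cost. The percent-deviation
# convention and the numbers are invented for illustration.

def variance_pct(actual: float, planned: float) -> float:
    """Percent deviation of actual from plan (positive = overrun)."""
    return 100.0 * (actual - planned) / planned

# Hypothetical monthly snapshot
effort = variance_pct(actual=1320.0, planned=1200.0)      # staff-hours
cost = variance_pct(actual=98000.0, planned=100000.0)     # dollars

print(f"Effort variance: {effort:+.1f}%")  # +10.0%
print(f"Cost variance: {cost:+.1f}%")      # -2.0%
```

The same helper serves progress tracking (e.g. milestones completed vs. scheduled) by substituting the appropriate quantities.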

  10. 2) Requirements Management • Requirements Stability: Changes in requirements -- similar to “Requirements Volatility”, discussed in modules on Software Reliability • Size Stability: Changes in estimated software size • Process Stability: Changes in process
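
One common way to turn "changes in requirements" into a number is a volatility ratio. This sketch is an assumption for illustration, not the SEI's definition: the formula (added + changed + deleted, normalized by the baseline count) and the sample counts are hypothetical.

```python
# Hedged sketch of a requirements-volatility measure: total churn
# (added + changed + deleted requirements) relative to the baseline.
# The formula and the counts below are illustrative assumptions.

def requirements_volatility(added: int, changed: int, deleted: int,
                            baseline: int) -> float:
    """Fraction of the requirements baseline that churned this period."""
    return (added + changed + deleted) / baseline

v = requirements_volatility(added=12, changed=20, deleted=8, baseline=200)
print(f"Volatility this period: {v:.2f}")  # 0.20
```

Size stability could be tracked the same way, substituting estimated-size deltas for requirement counts.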

  11. 3) Quality Assurance • Audit Results: Actual vs. goal for defects, process problems • Review Results: Similar to above • Peer Review Results: Similar to above • Trouble Reports: Total vs. plan or historical experience

  12. 4) Project Planning and Tracking • Computer Resource Utilization: Do you have enough to do the job? • Note that this is a product metric

  13. CMM Measures vs. Maturity Level • The same data may be analyzed differently at different maturity levels, e.g. “Trouble Reports” • Level 2: • Number of reports • Number of open vs. closed reports • Level 3: • Number compared with historical data • Time required to close the reports • Level 4: • Pareto analysis and root cause analysis
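
The point of slide 13 is that the same raw data supports progressively deeper analysis as maturity rises. A minimal sketch, with made-up trouble-report records and categories, might look like this:

```python
from collections import Counter

# Illustrative trouble-report records; fields and values are invented.
reports = [
    {"id": 1, "status": "open",   "cause": "requirements"},
    {"id": 2, "status": "closed", "cause": "coding"},
    {"id": 3, "status": "closed", "cause": "requirements"},
    {"id": 4, "status": "open",   "cause": "design"},
]

# Level 2 view: simple counts, open vs. closed
open_n = sum(r["status"] == "open" for r in reports)
closed_n = len(reports) - open_n

# Level 4 view: Pareto-style tally by root cause, most frequent first
by_cause = Counter(r["cause"] for r in reports).most_common()

print(open_n, closed_n)  # 2 2
print(by_cause)          # [('requirements', 2), ...]
```

The Level 3 views (comparison with historical data, time to close) would reuse the same records, joined with a historical baseline and timestamps.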

  14. Pareto Analysis • Sort data by most frequent or most costly • See Andersen, Chapter 6 • This will be covered in more detail in a later module
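
The sort-and-accumulate step of a Pareto analysis can be sketched in a few lines. The defect categories, counts, and the 80% cutoff are illustrative assumptions.

```python
# Minimal Pareto analysis: sort defect categories by frequency and
# find the "vital few" that account for ~80% of all defects.
# Categories and counts are invented for illustration.

counts = {"interface": 45, "logic": 25, "data": 15, "docs": 10, "other": 5}

total = sum(counts.values())
running, vital_few = 0, []
for category, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    running += n
    vital_few.append(category)
    if running / total >= 0.80:
        break

print(vital_few)  # ['interface', 'logic', 'data'] -- 85% of defects
```

Sorting by cost instead of frequency only changes the values in `counts`; the accumulation logic is identical.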

  15. Root Cause Analysis • What happened? • Where (in the process) did it happen? • Why did it happen? • How can we change the process so it won’t happen again? • See Andersen (all) • This will be covered in more detail in a later module

  16. Aligning Measures at Different Levels in the Organization

  17. Each Level in the Organization has Different Purposes • Each level needs different measures • Goals are different • Questions are different • Appropriate level of attention is different • Sometimes they interpret the same information in different ways • Sometimes the same information need may have different measures at different organizational levels

  18. Same Information Need, Different Measures • Information Need: Cycle Time • Software Developer: Measure length of the software development cycle for each product (Goal: improve efficiency of my software development) • Program Manager: Measure length of the development process for the entire program (Goal: improve efficiency of my program) • Software Process Owner: Measure mean and variance of software development cycles for all products (Goal: competitive advantage for my process) • Division Manager: Measure mean and variance of the development process for all projects (Goal: competitive advantage for my division)
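
The developer-level and process-owner-level views of cycle time can be derived from the same data, as this sketch shows. Product names and cycle times (in days) are invented for illustration; population variance is one reasonable choice, not a prescription.

```python
from statistics import mean, pvariance

# Same information need, different measures: a developer looks at one
# product's cycle time; the process owner summarizes all products.
# Product names and durations (days) are illustrative assumptions.

cycle_times = {"prod_a": 120, "prod_b": 95, "prod_c": 140, "prod_d": 105}

one_product = cycle_times["prod_a"]        # developer's view
fleet_mean = mean(cycle_times.values())    # process owner's view
fleet_var = pvariance(cycle_times.values())

print(one_product, fleet_mean, fleet_var)
```

Consolidation like this is exactly the "flow of data" idea in the suggested method: lower levels report per-product actuals, and higher levels compute summaries from them.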

  19. Same Data, Multiple Uses & Interpretations • Engineer: Uses data to understand & change work products; needs data within seconds, minutes, hours, or days • Software Manager: Uses data to understand trends and potential problems; needs data hourly, daily, or weekly • Program Manager: Uses data to prepare plans, schedules, and estimates; needs data weekly or monthly • Division Manager: Uses data to prepare business strategies and plans; needs data quarterly or annually

  20. Plan Measures for Maximum Utility • Measurement costs money • For collection, storage, analysis • Consequences of incorrect data • People fear measurement • The more things you measure, the less support you get for measurement • People will resist measuring things that do not benefit them directly

  21. Aligning Measures at Different Levels in the Organization A Suggested Method

  22. Objectives • Try to arrange things so that data collected and utilized at one level can be used at the next level (perhaps after some analysis or consolidation). • Make sure each item measured provides a specific benefit to the measurer. • Address needs at all levels. • Avoid requiring people to measure things that do not give them specific benefits.

  23. Step 1 - Define Goals & Measures • Build a matrix of metrics vs. users • Rows are metrics: Process Cost, Process Quality, Process Cycle Time, Product Cost, etc. • Columns are users: Development Team, Project Manager, Engineer, etc. • In each cell, define a Goal and a Measure for that metric and that user

  24. Step 2 - Define Flow of Data • Example for cycle time: • Project engineering teams collect the actual data • Program managers each receive the cycle time for their own product • The process owner consolidates the per-product cycle times into a cycle time mean & variance across all products

  25. Focus on the Measures Appropriate for Your Level Use lower level measures only when you need to understand WHY your measures have unsatisfactory values

  26. Example of Use of Lower Level Measures • Problem: program cycle time is too long • Approach: program manager might look at individual sub-process cycle times to learn where the problems are • But note that the problems may be in the handoff between sub-processes, or in constraints imposed by the program manager • Problem areas need help, not criticism

  27. Focus on the Measures Appropriate for Your Level (Continued) Use higher level measures only to see the overall effects of your efforts • How is my division doing in cycle time? • Am I leading or trailing?

  28. Other Special Issues

  29. Selection of Measures is Important • It takes time and effort to collect data, analyze the results, and respond • The recommended approach is to focus on a small number of useful measures • Track the things that relate to significant project risks, such as size, cost, schedule, or defects

  30. Measurements are Intrusive • Individual software developers may resent being measured • Get them involved in definition of measures • Do not use measures to find fault with individuals • Generally, the fault is with the process, the procedures, the tools, or the estimates

  31. Measure the Right Thing • Watts Humphrey’s model of why it is easy to measure the wrong thing distinguishes three views: • What is supposed to happen • What actually happened • What you think actually happened

  32. Other Pitfalls of Measurement If you measure people without their knowing it • You may get accurate data until they find out about it • Then you lose accuracy AND the trust of the people

  33. A Preferred Approach ... • Educate everyone in the principles of process improvement • Work with the software staff to develop effective measures that they do not fear and will not “fudge” • Collect and analyze data in a cooperative manner

  34. When You Use the Data ... • Use the data ONLY to evaluate the process • Demonstrate use of the data • so people see it in action, as you told them it would be used

  35. More Pitfalls • You can measure against goals, or • You can measure to determine facts about the process

  36. If You Measure Against Goals ... • ... your staff may change behavior and the process to improve the measurement • But you want them to change the process to improve actual performance • Make sure you are asking for the right things

  37. Summary • SEI CMM/CMMI has a number of recommended measures • Which may be used differently at different levels of maturity • Align measures for different parts of the organization • To increase value and decrease cost of collection • Selection of measures deserves careful planning and analysis • Understand the impact on people being measured

  38. In Future Modules ... • The measurement process • Measuring the software development process • Tying measures to the software development process • Root cause analysis

  39. References • Andersen, Bjorn and Tom Fagerhaug, Root Cause Analysis, ASQ Quality Press, Milwaukee, Wisconsin, USA. ISBN-13: 978-0-87389-692-4. • Basili, Victor and David Weiss, “A Methodology for Collecting Valid Software Engineering Data,” IEEE Transactions on Software Engineering, Vol. SE-10, No. 6, Nov. 1984, p. 728. • Grady, Robert B. and Deborah L. Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice-Hall, Englewood Cliffs, N.J., 1987. ISBN 0-13-821844-7.

  40. References (continued) • Weinberg, Gerald M., Quality Software Management, Volume 1: Systems Thinking, Dorset House, New York, 1992. ISBN 0-932633-22-6. • Weinberg, Gerald M. and Schulman, Human Factors, Vol. 16, No. 6, 1974, pp. 70-77.

  41. END OF MODULE 28
