
SMU CSE 8314 Software Measurement and Quality Engineering


Presentation Transcript


  1. SMU CSE 8314 Software Measurement and Quality Engineering Module 29 Measuring and Improving the Software Development Process

  2. Outline • The Measurement Process • Tying Measures to the Software Development Process • Root Cause Analysis

  3. The Measurement Process

  4. Core Process The ISO 15939 Software Measurement Process Model [Diagram] Activities: • Establish and Sustain Commitment • Plan the Measurement Process • Perform the Measurement Process • Evaluate Measurements. Flows: project management and software development supply information needs; the process returns information products; measurement data are stored in a measurement data base; information and evaluation results drive improvement actions fed back into planning.

  5. Core Measurement Process [Diagram] Within the Plan and Perform activities, the core process is a cycle: Plan → Collect → Analyze → Utilize. Its results are used to manage risks on the project.

  6. The Measurement Process Plan • Define goals & information needs • Work with customer & stakeholders • Prioritize and align • Select measures • See prior module on selecting SW measures • Plan the data collection, analysis and reporting procedures • Define criteria for evaluation • Obtain approval for resources • Deploy supporting technologies

  7. The Measurement Process Collect • Integrate the data collection process into the software & management processes • Consider the need for culture change • Consider the impact of data collection • Communicate the process to affected parties • Deploy the collection process • Collect & verify the data • Record and store the data

  8. The Measurement Process Analyze • Analyze the Data • Interpret relative to models, etc. • Report/Communicate the Results • Document • Display/Communicate to users • Interpret

  9. The Measurement Process Utilize • Change the project • Identify trends early • Adjust plans to head off problems • Change the process • At the project level • At the organizational level • Change the measurement process (this is Evaluate Measurements, not part of the core process) • Data definitions, collection process, validation, etc.

  10. Each Measure Must Have an Objective and a Customer • Objective must be clear • Otherwise, people will not know why and may resist or may collect the wrong thing • Interpretation of the data must be clear • Otherwise, measurements can be misused and misinterpreted • End user (customer) of the measurement must want the information • Otherwise, all of the effort is for no purpose

  11. The Objective can be Stated in terms of Three Elements • To understand / evaluate / control / predict / report • An attribute of an entity (resource, process, product) • In order to satisfy a purpose (related to the goal of the measure)

  12. Construct a Sentence for Each Measure “We are measuring MMM so we can OOO the AAA of the EEE in order to PPP” • MMM is a measure • OOO is an objective • AAA is an attribute • EEE is an entity • PPP is a purpose This idea is due to Linda Westfall (see references) “We are measuring the number of lines of code so we can understand the size of the software in order to predict the cost and schedule of the project.”
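The fill-in-the-blank template above can be sketched as a small helper; this is an illustrative function (not part of the module) that assembles a Westfall-style measurement statement from the five elements.

```python
# Illustrative helper: build the sentence "We are measuring MMM so we
# can OOO the AAA of the EEE in order to PPP" from its five elements.
def measurement_statement(measure, objective, attribute, entity, purpose):
    """Return a Westfall-style measurement statement."""
    return (f"We are measuring {measure} so we can {objective} "
            f"the {attribute} of the {entity} in order to {purpose}.")

# The lines-of-code example from the slide:
print(measurement_statement(
    "the number of lines of code", "understand", "size",
    "software", "predict the cost and schedule of the project"))
```

Writing each proposed measure through such a template is a quick check that all five elements actually exist; a measure with no identifiable objective or purpose should not be collected.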

  13. Purpose and Use must be Communicated to All Concerned • If people do not know why they are being measured, they will mistrust the measurements and generate bad data

  14. Use of Measurements must be Demonstrated

  15. Integrating Measures and Data Collection with the Software Development Process

  16. Who Cares about What? • Managers -- Project measures • That’s how they are evaluated • But if the project is in trouble they need to know more • Developers -- Product measures • That’s how they are evaluated • Both should care about Process measures • This is usually where you learn the reasons for a problem

  17. What Should we Measure? [Diagram: the Process determines the quality of the Product; the Product determines the success of the Project; root causes of product and project problems trace back to the process] • Process Measures • How effective is the process? • How well are we following the process? • Risk monitoring • Product Measures • Performance and quality • How well is the product meeting its requirements? • Project Measures • Used to monitor the state of the project • How are we doing relative to cost, schedule, staffing, etc.?

  18. Example: Same Base Measure, Different Uses Base Measure: Number of defects in the software Use in a Project Measure: number of defects must be less than target before project is complete, thus the data are used to measure project status. Use in a Product Measure: product quality index is “defects per 1000 LOC”. Used to determine probable warranty cost. Use in a Process Measure: quality index is compared for different processes and for process improvements to determine which processes are best for future projects in the organization
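The three uses above can be made concrete in a few lines; this is a sketch with made-up numbers, showing one base measure (defect count) feeding a project, a product, and a process indicator.

```python
# Base measures (illustrative values, not real project data):
defects = 42        # number of defects in the software
loc = 28_000        # lines of code in the product

# Project measure: defect count against a completion target.
target = 50
project_ok = defects < target            # exit criterion for project status

# Product measure: quality index, defects per 1000 LOC,
# usable e.g. to estimate probable warranty cost.
quality_index = defects / (loc / 1000)

# Process measure: compare the quality index across processes to pick
# the better process for future projects.
old_process_index = 2.4
improved = quality_index < old_process_index
```

The point is that the base measure is collected once; the project, product, and process views differ only in how the same number is normalized and compared.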

  19. How do you Fix Problems Identified by Project Measures? • An effective solution is usually based on analyzing and fixing the process

  20. Questions to help identify process measures Example • Project measure: schedule performance • Problem: schedule performance is poor (we are behind schedule) • Solution: understand why the process is not achieving the desired schedule • Are we not following the process? • Is the process inefficient? • Are people poorly trained? • Is the process ill suited to this project?

  21. Example: A Measure & its Impact Information Need: Productivity Measure 1: Lines of code per day Use: reward those who produce the most lines of code per day Result: people produce bloated, inflated code in order to look good Measure 2: Requirements met and tested, weighted by complexity of requirement Use: track against history and use to identify process bottlenecks Result: people will use the data to make the process more efficient, resulting in lower cost
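Measure 2 above (requirements met and tested, weighted by complexity) can be computed as a weighted ratio; the requirements list and weights below are hypothetical.

```python
# Hypothetical data: (met_and_tested, complexity_weight) per requirement.
requirements = [
    (True, 3),   # complex requirement, done and tested
    (True, 1),   # simple requirement, done and tested
    (False, 5),  # very complex requirement, not yet done
    (True, 2),
]

# Complexity-weighted progress: credit only requirements that are
# met AND tested, weighted by their complexity.
earned = sum(w for done, w in requirements if done)
total = sum(w for _, w in requirements)
progress = earned / total   # track this against history to find bottlenecks
```

Unlike lines of code per day, this measure cannot be inflated by producing bloated code, which is why it produces the healthier behavior the slide describes.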

  22. Good and Not-So-Good Measures Goal: Produce software more efficiently Information Needed: Efficiency Measure 1: tests completed per week Result: easy tests done first; corners cut in testing; hard problems ignored or deferred Measure 2: rework Result: process and methods are improved to reduce rework, resulting in more efficient software development • But rework is a lagging indicator - it does not spot problems in advance

  23. What Attributes Can We Measure? • We want attributes that relate to our goals • time, resources, performance, quality, etc. • The following type of matrix can help (attribute — Process / Product / Project): • Time — How fast can we manufacture? / What is our cycle time? / Are we on schedule? • Resources — What will it cost? / What is our productivity? / Expenses vs. budget? • Performance — Does it work? / Meets perf. goals? / Meets mgt. goals? • Quality — In-process defects? / Post-release defects? / Customer satisfaction?

  24. Tie Measures to the Process -- A Measure for Every Phase (phases: Planning, Requirements, Design, Coding, Integration, Test) • Staffing — measured in every phase • Requirements Stability — measured in the Requirements, Design, and Coding phases • Design Complexity — Design phase • Code Complexity — Coding phase • Etc.

  25. Plan the Measurement Process to be Part of the Software Process • The software process and associated procedures define the day-to-day actions • Thus they are the ideal place to communicate details of how data should be collected & evaluated • The software process is defined in terms of tasks to be done, inputs and outputs, and sequencing

  26. Typical Process Description [Diagram: Design Software Module → Inspect Design (defects fed back) → Software Design Review → Place Module under Configuration Control; in parallel, Design Module Test Cases → Place Tests under Configuration Control]

  27. “Inspect Design” Detailed Process Description [Diagram: within the process of the previous slide, the Inspect Design task expands to: Identify Inspection Participants → Hold Inspection Meeting → Record Defects Detected → Track Defects to Closure → Decide if OK to Release to CM]

  28. “Record Defects Detected” (text description of task) • For each defect, identify the following: • Type • Origin • Severity • Date and Time Found • Person assigned to Correct • Estimated effort to Correct • Record the above in the defect database • When defect is corrected, record date and time corrected and total effort to correct
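The fields the task description asks for map naturally onto a small record type; this is a minimal sketch, with field names of my choosing rather than from the module.

```python
# Minimal defect record mirroring the "Record Defects Detected" task.
# Field names are illustrative, not from the course material.
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class DefectRecord:
    defect_type: str                  # Type
    origin: str                       # phase where the defect originated
    severity: str
    found_at: datetime                # date and time found
    assigned_to: str                  # person assigned to correct
    estimated_effort_hours: float     # estimated effort to correct
    corrected_at: Optional[datetime] = None
    actual_effort_hours: Optional[float] = None

    def close(self, when: datetime, effort: float) -> None:
        """On correction, record date/time corrected and total effort."""
        self.corrected_at = when
        self.actual_effort_hours = effort
```

Capturing estimated and actual effort on the same record is what later enables the cost-weighted origins analysis discussed under root cause analysis.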

  29. Tie the Software Process to the Measures • Consider measures for each process task or procedure or input or output • Why (Goal of this measure) • What to collect (base measures; clear definitions) • How to collect (forms, procedures, automation) • Responsibility for collection (individuals, organizations) • How it will be used (analysis, formulas, etc.)

  30. Tie the Software Process to the Measures (continued) • Add this information to the process description -- usually as part of the text description of the bottom level task • But don’t overdo it • And make sure there is a concrete benefit for each measurement being recommended.

  31. Another Example: Focusing on Status Measures Process for Design Walkthrough 1) Collect design documents 2) … (walkthrough details)... 3) … (walkthrough details)... 4) Identify Defects 5) Categorize by Priority and Type 6) Document in Walkthrough Report 7) Define action plan for Each Defect 8) Assign each defect to Analyst for Resolution 9) Report status each week until closure, using “open defects” report

  32. Typical Graph of Status Data [Chart: Open Defects Report for module XXX123]
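The weekly count behind an open-defects chart like the one above is a simple running balance of defects opened minus defects closed; the weekly figures below are made up for illustration.

```python
# Illustrative weekly data for a module's "open defects" report.
opened = [3, 5, 4, 2, 1, 0]   # defects found in each week's walkthroughs
closed = [0, 2, 4, 4, 3, 2]   # defects resolved each week

open_defects = []
running = 0
for o, c in zip(opened, closed):
    running += o - c           # running balance = cumulative opened - closed
    open_defects.append(running)
# A healthy project shows this series trending to zero as closure nears.
```

Reported weekly (step 9 of the walkthrough process), this single series gives management an early signal when closure is stalling.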

  33. Root Cause Analysis

  34. What Is Root Cause Analysis • Finding the real, underlying cause of a problem rather than just dealing with the symptoms • Used for problems that occur often and consume a lot of resources • The most expedient solution to any problem is usually to deal with symptoms • root cause analysis provides a more lasting and, ultimately, more cost effective solution

  35. From systems-thinking.org To find root causes, there is really only one question that's relevant: "What can we learn from this situation?" Research has repeatedly shown that unwanted situations within organizations are about 95% related to process problems and only 5% related to personnel problems. Yet most organizations spend far more time looking for culprits than causes, and because of this misdirected effort they seldom gain the benefit they could from understanding the foundation of the unwanted situation.

  36. Example • Problem: expediting certain jobs causes others to be delayed, resulting in overall loss of efficiency • Root cause: why are we expediting those jobs in the first place? Why is the root cause not usually addressed? Because the person in charge of solving the problem (expediting the jobs) isn’t being asked to look at the whole picture

  37. Example of Effective RCA (1 of 5) (From systems-thinking.org) • The Plant Manager walked into the plant and found oil on the floor. <<< Symptom • He called the Foreman over and asked him why there was oil on the floor. • The Foreman indicated that it was due to a leaky gasket in the pipe joint above. <<< Cause • The Plant Manager then asked when the gasket had been replaced and the Foreman responded that Maintenance had installed 4 gaskets over the past few weeks and each one seemed to leak.

  38. Example of Effective RCA (2 of 5) (From systems-thinking.org) • The Foreman also indicated that Maintenance had been talking to Purchasing about the gaskets because it seemed they were all bad. <<< More fundamental cause • The Plant Manager then went to talk with Purchasing about the situation with the gaskets. The Purchasing Manager indicated that they had in fact received a bad batch of gaskets from the supplier.

  39. Example of Effective RCA (3 of 5) (From systems-thinking.org) • The Purchasing Manager also indicated that they had been trying for the past 2 months to try to get the supplier to make good on the last order of 5,000 gaskets that all seemed to be bad. • The Plant Manager then asked the Purchasing Manager why they went with the lowest bidder and he indicated that was the direction he had received from the VP of Finance. <<< More fundamental cause

  40. Example of Effective RCA (4 of 5) (From systems-thinking.org) • The Plant Manager then went to talk to the VP of Finance about the situation. • When the Plant Manager asked the VP of Finance why Purchasing had been directed to always take the lowest bidder, the VP of Finance said, "Because you indicated that we had to be as cost conscious as possible, and purchasing from the lowest bidder saves us lots of money!"

  41. Example of Effective RCA (5 of 5) (From systems-thinking.org) • The Plant Manager was horrified when he realized that he was the reason there was oil on the plant floor. Root Cause!!! Note the string of causes, across multiple parts of the organization, before the manager found the underlying, root cause

  42. Some Methods of Root Cause Analysis • Cause and Effect Charts • The Five Whys • Keep asking “why” until you find the underlying reason for a problem • Fault Tree Analysis • Matrix Diagrams • Similar to QFD • There are many other methods (see Andersen)

  43. Root Cause Analysis for Assignment 5 • After you have identified the most important problems, you will be expected to use Root Cause Analysis to determine the underlying causes of these problems • You will be expected to read Andersen and use techniques found there (at least two of them) in your analysis in Assignment 5 • This will come AFTER you have done the value-added analysis and the cost-of-quality analysis

  44. Root Cause Analysis Applied to Defects • Used to identify causes of defects so you can change the process to prevent them • The methods can also be used to trace forward: • given a defect, what problems might it cause? • Unfortunately, the terminology differs greatly among various authors.

  45. IEEE Standard for Defect Classification • See IEEE standards for latest status • Rather elaborate and comprehensive • Tends to be rather generic and therefore hard to apply to specific projects • This illustrates one of the problems with standards -- they are often too universal and generic to be of much use for specific applications

  46. Hewlett-Packard Model for Defect Analysis Each defect is classified in terms of three factors: 1) Origin (phase of process where defect originated) • Specification/Requirements • Design • Coding • Environment and Support • Documentation • Operator or Other See Grady pp 127-129, and Leath (references)

  47. Typical Results from Origins Analysis [Chart]

  48. Origins Analysis Weighted by Cost to Fix Defects (from HP) • Weighting Factors: • Specification 14.25 • Design 6.25 • Code 2.50 • Documentation 1.00 • Operator 1.00 • Other 1.00
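The HP weighting factors above turn raw defect counts per origin into a cost-oriented profile; the defect counts in the example are invented for illustration.

```python
# Cost-to-fix weighting factors from the HP data on this slide.
WEIGHTS = {
    "Specification": 14.25,
    "Design": 6.25,
    "Code": 2.50,
    "Documentation": 1.00,
    "Operator": 1.00,
    "Other": 1.00,
}

def weighted_origin_profile(defect_counts):
    """Multiply raw defect counts per origin by the cost-to-fix weight,
    showing where defects cost the most to repair."""
    return {origin: n * WEIGHTS[origin]
            for origin, n in defect_counts.items()}

# Illustrative counts (not real data): 10 spec defects vs. 40 code defects.
profile = weighted_origin_profile({"Specification": 10, "Code": 40})
# -> {"Specification": 142.5, "Code": 100.0}
```

Note how a handful of specification defects can outweigh four times as many coding defects, which is the argument for focusing process improvement on the early phases.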

  49. Hewlett-Packard Model (continued) 2) Type of Defect (what caused the problem) • Specification error • Functionality (impossible or incorrectly described) • Interface (hardware, software, or user; design or implementation) • Interprocess communication • Data definition or data handling error • Module design

  50. More Types of Defects • Logic error or description error • Error checking • Standards specified or applied incorrectly • Computation error • Test error (hardware, software, process) • Integration error • Tools error
