Presentation Transcript


  1. Pop Quiz • What is an Orthogonal Defect? • Which is more precise: predictive models or management models? Why? • Essay: Why are metrics and models an important element in managing the quality of software development? • T/F: An evolutionary prototype is evolved rather than thrown away.

  2. Software Quality Engineering CS410 Class 10: Quality Management Models

  3. Quality Management Models • Quality Management Models • Tools for helping to monitor and manage the quality of software while it is under development • Goal - provide early warning signs so that timely improvement actions can be planned and implemented • Models must cover the entire development process (front end and back end) to be useful • Some Reliability Models can be used as Quality Management Models

  4. Quality Management Models • Rayleigh Model Framework • Based on two assumptions: 1. The defect rate observed during development is positively correlated with the defect rate observed in the field 2. Given the same error injection rate, if more defects are found and removed early, then fewer will remain in later stages (i.e., in the field) • “Do it right the first time” - it is important to manage quality throughout the entire development process

  5. Quality Management Models 1. The best scenario is to prevent errors from being injected into the development process (and hence, into the product) 2. When errors are introduced, improve front-end processes (design reviews, code reviews, etc.) to remove as many of them as early as possible. 3. Unit testing serves as the last chance to catch errors in the front-end process (the gatekeeper for front-end process escapes).

  6. Quality Management Models • The Rayleigh model is a good Quality Management model because it promotes defect prevention and early defect detection • As an in-process tool, the data can indicate the quality direction of the current project • Comparison to previous models would show whether the defect removal is more/less/same front-end loaded: same - quality should be similar to previous products; more - quality should be better than previous products; less - quality action should be taken

  7. Quality Management Models • Rayleigh curve shifts: Left - more front-end loaded (early defect removal); Right - less front-end loaded; Up - more error injection; Down - less error injection (defect prevention) • Goal - shift the curve left and down as much as possible • Fig 9.2 p. 223
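
The curve being shifted is the Rayleigh defect-arrival curve. Below is a minimal sketch, assuming the common parameterization in which K is the total expected defect volume and t_m is the time of the defect-removal peak; the numbers are illustrative, not taken from the course material. Shifting the curve left corresponds to a smaller t_m, and shifting it down to a smaller K.

```python
import math

def rayleigh_defect_rate(t, total_defects, peak_time):
    """Expected defect-removal rate at time t: K * (t / t_m^2) * exp(-t^2 / (2 * t_m^2))."""
    return total_defects * (t / peak_time**2) * math.exp(-t**2 / (2 * peak_time**2))

# Illustrative plan: 500 total expected defects, removal peaking in month 4 of a 10-month schedule.
# "Shift left" = smaller peak_time (earlier removal); "shift down" = smaller total_defects (less injection).
for month in range(1, 11):
    rate = rayleigh_defect_rate(month, total_defects=500, peak_time=4)
    print(f"month {month:2d}: ~{rate:5.1f} defects expected")
```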

  8. Quality Management Models • Actions for shifting the curve • Left (early defect removal) • focus on design reviews / code inspections • moderator training • inspection checklists • in-process measurements to track effectiveness • mini-builds (prior to formal integration)

  9. Quality Management Models • Actions for shifting the curve • Down (defect prevention) • implementation of DPP (Defect Prevention Process) • use of CASE tools • improved communications between interface-owning groups

  10. Quality Management Models • Challenge - knowing exactly what the differences in the curve represent: • Lower defect removal rates (lower curve) could mean fewer errors injected, or ineffective design reviews and code inspections • Higher defect removal (higher curve) could mean higher error injection, or better design reviews and code inspections • Additional indicators are needed to help interpret the data

  11. Quality Management Models • Process quality metrics can be used as additional indicators, for example: • Inspection effort (expressed in hours) combined with defect rate, each compared to historical data

  12. Quality Management Models • Best case - high effort / low defects: Indicates that the design and/or code contained fewer defects and that the design reviews and code inspections were effective, so better quality was ensured. • Good / Not Bad case - high effort / high defects: Indicates that error injection may be higher, but the process was effective in detecting defects.

  13. Quality Management Models • Unsure case - low effort / low defects: Not clear whether fewer defects were injected, or whether less time spent in design reviews and code inspections simply turned up fewer defects. • Worst case - low effort / high defects: Indicates high error injection and a process that was not effective in detecting defects.
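
A minimal sketch of the inspection effort / defect rate matrix described in the last two slides. The historical baselines used as thresholds are made-up values for illustration; a real comparison would use the organization's own norms.

```python
# Assumed historical norms (illustrative only).
BASELINE_EFFORT_PER_KLOC = 2.5    # inspection hours per KLOC
BASELINE_DEFECTS_PER_KLOC = 10.0  # defects found per KLOC during inspections

def classify(effort_per_kloc, defects_per_kloc):
    """Place a component into one of the four effort/defect quadrants."""
    high_effort = effort_per_kloc >= BASELINE_EFFORT_PER_KLOC
    high_defects = defects_per_kloc >= BASELINE_DEFECTS_PER_KLOC
    if high_effort and not high_defects:
        return "best case: effective inspections, likely low error injection"
    if high_effort and high_defects:
        return "good / not bad case: higher injection, but the process is catching it"
    if not high_effort and not high_defects:
        return "unsure case: low injection, or inspections too shallow to find defects"
    return "worst case: high error injection and an ineffective inspection process"

print(classify(effort_per_kloc=3.1, defects_per_kloc=6.2))
print(classify(effort_per_kloc=1.2, defects_per_kloc=14.8))
```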

  14. Quality Management Models • PTR Sub-model • The continuous integration used in some product life cycles makes parametric models difficult to apply. • Spiral and iterative life cycles make it difficult to differentiate between front-end and back-end processes. • Different quality management models are required for this type of product development life cycle.

  15. Quality Management Models • PTRs (Problem Tracking Reports) are a common tool used for managing the change process during development and testing. • The PTR model is a non-parametric sub-model of the overall defect removal model. • It spreads over time the number of defects expected to be removed during formal testing so that precise tracking can be done.

  16. Quality Management Models • The PTR model is a function of: 1. Planned or actual lines of code integrated over time. Can be estimated from historical data, design documents, and estimating tools 2. Expected overall PTR rate per KLOC. Can be estimated from historical data. 3. PTR-surfacing pattern after the code is integrated. Depends on testing activities and integration plans (e.g., weekly integration yields a faster discovery/fix/integration cycle than bi-weekly or monthly)

  17. Quality Management Models • Deriving the PTR Sub-model Curve 1. Determine the code integration plan 2. Multiply the expected PTR rate by the expected KLOC to derive the expected PTRs per integration 3. Spread the PTRs over time according to the PTR-surfacing pattern and sum the PTRs per time period 4. Update the model when the integration plan changes or actual integration data becomes available 5. Plot the curve and track the current project in terms of months until GA (General Availability)
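
A minimal sketch of steps 1-3 above. The integration plan, the expected PTR rate per KLOC, and the surfacing pattern are illustrative assumptions; in practice they would come from the project's integration plan and historical data.

```python
PTR_RATE_PER_KLOC = 4.0                          # expected PTRs per KLOC (assumed historical rate)
SPREAD = [0.10, 0.25, 0.30, 0.20, 0.10, 0.05]    # fraction of PTRs surfacing 0..5 weeks after integration

# Code integration plan: (week of integration, KLOC integrated) -- illustrative values.
integration_plan = [(2, 30), (6, 45), (10, 25)]

horizon_weeks = 20
expected_ptrs = [0.0] * horizon_weeks
for week, kloc in integration_plan:
    total_ptrs = kloc * PTR_RATE_PER_KLOC         # step 2: expected PTRs for this integration
    for lag, fraction in enumerate(SPREAD):       # step 3: spread them over the following weeks
        if week + lag < horizon_weeks:
            expected_ptrs[week + lag] += total_ptrs * fraction

for week, ptrs in enumerate(expected_ptrs):
    print(f"week {week:2d}: {ptrs:6.1f} expected PTR arrivals")
```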

  18. Quality Management Models • Example PTR Sub-model: Fig 9.7 p. 229 • Note - the PTR sub-model is non-parametric and therefore cannot make projections. It is used for tracking current data. • Goal - relative to the model, if the actual curve (actual defect arrivals) rises and peaks earlier, then it will also decline faster relative to the GA date (i.e., fewer defects will be shipped to the field)

  19. Quality Management Models • PTR Arrival/Backlog Projection Model • Goal - determine whether the scheduled code-freeze date can be met without sacrificing quality. In other words - will the PTR arrival rate and PTR backlog decrease to acceptable levels? • Other models do not answer this question: • PTR Sub-model - cannot predict • Reliability Growth Models - predictions are made too late

  20. Quality Management Models • Model uses a general linear approach, and assumes that relevant variables observed over time provide an adequate prediction. • Predictor Variables: • Chronological time - e.g., weeks • Time lag variables - PTR mean time to resolution • Cumulative KLOC integrated - total of all integrations • Significant activities - e.g., component test, system test, etc. PTR Arrivals = constant + f(Week, Week², Week³, KLOC, KLOC², PTR arrivals in preceding week, System Test) + error term Example: Fig. 9.11 p. 235
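
As a minimal sketch, the general linear model on this slide can be fit by ordinary least squares; the NumPy example below uses synthetic weekly data as placeholders, and the exact estimation method and predictor set would follow the project's own tracking data.

```python
import numpy as np

# Synthetic weekly data (placeholders, not real project data).
weeks = np.arange(1, 13, dtype=float)
kloc = np.cumsum([8, 8, 10, 10, 12, 12, 6, 6, 4, 4, 2, 2]).astype(float)          # cumulative KLOC integrated
arrivals = np.array([5, 9, 14, 22, 30, 34, 31, 26, 20, 15, 11, 8], dtype=float)   # weekly PTR arrivals
system_test = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 1, 1], dtype=float)         # 0/1 system-test indicator
lagged = np.concatenate(([0.0], arrivals[:-1]))                                    # arrivals in the preceding week

# Design matrix: constant, Week, Week^2, Week^3, KLOC, KLOC^2, lagged arrivals, system-test flag.
X = np.column_stack([np.ones_like(weeks), weeks, weeks**2, weeks**3,
                     kloc, kloc**2, lagged, system_test])
coef, *_ = np.linalg.lstsq(X, arrivals, rcond=None)

predicted = X @ coef
print("fitted weekly PTR arrivals:", np.round(predicted, 1))
```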

  21. Quality Management Models • Reliability Growth Models • Models from previous products can be used to track the defect rates of the current product • To experience significant quality improvement, the current defect arrival rate must fall below the model curve • Comparing to a model allows for quality actions to be identified and implemented. • Models can be good for determining the end date of testing • Other models should be used in conjunction because the Reliability Growth Models do not focus on the front-end of the process
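
A minimal sketch of tracking against a growth-model curve: the exponential (Goel-Okumoto style) arrival-rate curve below stands in for a model fitted to a previous product, and both its parameters and the weekly actuals are assumptions for illustration.

```python
import math

A, B = 400.0, 0.12   # total expected defects and decay rate from the prior-product model (assumed)

def model_arrival_rate(week):
    """Expected weekly defect arrivals under an exponential growth model: A * B * exp(-B * week)."""
    return A * B * math.exp(-B * week)

actual_weekly_arrivals = [38, 36, 33, 31, 26, 22, 19, 15]   # current product, weeks 1..8 (illustrative)

for week, actual in enumerate(actual_weekly_arrivals, start=1):
    expected = model_arrival_rate(week)
    status = "below model - improvement" if actual < expected else "at/above model - plan quality actions"
    print(f"week {week}: actual {actual:3d} vs model {expected:5.1f}  ({status})")
```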

  22. Quality Management Models • Criteria for Model Evaluation • Most Important Criteria: • Timeliness of quality indication • “Raises a red flag” • Allows more time to react and recover • Scope of process coverage • Should address each phase • Should address quality of stage deliverables • Capability • Ability for model to provide information for planning and managing software development

  23. Quality Management Models • In-Process Metrics and Reports • A defect tracking and reporting system and a set of related in-process metrics are important elements in implementing Quality Management Models. • Especially important for large projects with multiple teams • Metric data and feedback need to be available to all levels and all teams

  24. Quality Management Models • Example - Using an Inspection Effort/Defect Rate report and matrix to assess the rigor and effectiveness of the inspection process: • Fig. 9.14 p. 240 and Fig. 9.15 p. 242 • A tool to help measure process quality by tracking inspection effort and defects detected/removed

  25. Quality Management Models • Example - Inspection report showing defect origin • Fig. 9.16 p. 245 • “Among the defects found by this stage, what is the percentage that should have been found by the previous stage?” • Does not require total defect data as the DRE (defect removal effectiveness) approach does. • This approach can be used as an in-process quality management tool
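
A minimal sketch of computing the percentage asked about on this slide from defect-origin records; the stage ordering and the sample records are illustrative assumptions.

```python
STAGE_ORDER = ["high-level design", "low-level design", "code", "unit test"]

# (stage where the defect was found, stage where it originated) -- sample records.
defects_found_at_code_inspection = [
    ("code", "code"), ("code", "low-level design"), ("code", "code"),
    ("code", "high-level design"), ("code", "code"), ("code", "low-level design"),
]

# Count defects that originated in an earlier stage than the one that found them.
escapes = sum(1 for found, origin in defects_found_at_code_inspection
              if STAGE_ORDER.index(origin) < STAGE_ORDER.index(found))
pct = 100.0 * escapes / len(defects_found_at_code_inspection)
print(f"{pct:.0f}% of defects found at code inspection should have been found earlier")
```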

  26. Quality Management Models • Orthogonal Defect Classification • Orthogonal - pertaining to, or composed of, right angles (Webster’s II) • In context: a set of mutually independent cause categories • ODC - a method for in-process quality management based on defect cause (or type) analysis, where a distribution of defect types is associated with process phases

  27. Quality Management Models • Concept: By examining the distribution of defect types, one can tell which development phase the current project is at (or should be at). • Eight Defect types: 1. Function (missing or incorrect functions) - Design Phase 2. Interface - low-level design 3. Checking - low-level design or code implementation

  28. Quality Management Models • Eight Defect types (cont.): 4. Assignment - Code Phase 5. Timing/Serialization - low-level design 6. Build/Package/Merge - library tools 7. Documentation - publications 8. Algorithms - low-level design
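
A minimal sketch of the ODC idea from the last two slides: tabulate the observed defect-type distribution and compare each type against the phase it is normally associated with. The type-to-phase mapping follows the list above; the sample defect records are made up.

```python
from collections import Counter

# Defect type -> process phase it is normally associated with (from the list above).
TYPE_TO_PHASE = {
    "function": "design",
    "interface": "low-level design",
    "checking": "low-level design / code",
    "assignment": "code",
    "timing/serialization": "low-level design",
    "build/package/merge": "library tools",
    "documentation": "publications",
    "algorithm": "low-level design",
}

# Classified defects from the current project (illustrative sample).
observed = ["assignment", "checking", "function", "assignment", "interface",
            "assignment", "algorithm", "assignment", "checking"]

counts = Counter(observed)
total = sum(counts.values())
for defect_type, count in counts.most_common():
    share = 100.0 * count / total
    print(f"{defect_type:22s} {share:5.1f}%  (normally associated with: {TYPE_TO_PHASE[defect_type]})")
# If the dominant types do not match the phase the project claims to be in
# (e.g., many 'function' defects late in coding), that is a possible out-of-control signal.
```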

  29. Quality Management Models • Key to ODC - compare the dominant defect types against the phase the project is actually in; a mismatch may indicate an out-of-control condition. • Defect Trigger - a condition that allows (or forces) a defect to surface. • Challenges for ODC: • What are the defect triggers? • Can defects really be classified as orthogonal? • Overlap and interrelations between proposed defect types • Indirect method for assessing project progress

  30. Quality Management Models Summary • Quality Management Models, unlike reliability models, need to focus on: • Timeliness - of quality indications • Scope of coverage - all phases of the development process • Capability - providing information (indicators and attributes) about quality • Tracking and reporting systems and sets of related in-process metrics are needed to make Quality Management Models work. • Defect cause analysis (e.g., ODC) can lead to a better understanding of the quality of the project.
