Software Reviews, Walkthroughs, Inspections, and Audits

Presentation Transcript

  1. Software Reviews, Walkthroughs, Inspections, and Audits. Better to find an error twice than never to find it at all. [Freedman/Weinberg 90] Freedman, D.P., and G.M. Weinberg, Handbook of Walkthroughs, Inspections, and Technical Reviews, Dorset House Publishing Co., Inc., 1990, pp. 89-161.

  2. Why Do We Have Formal Technical Reviews? • To err is human. • Many errors escape their originator more easily than they escape anyone else. • Reviews are educational.

  3. Purpose: Provide • visibility into state of project • opportunities for personnel (project and non-project) to discuss topics related to project • intermediate milestones and sense of progress • assessment of technical adequacy of project

  4. Use the power of a team to … • Point out needed improvements of a product. • Confirm the parts in which improvement is not needed or desired. • Make technical work more uniform and predictable in quality, which makes it more manageable.

  5. Technical review is different from other reviews • For example, budget and management reviews • A technical review answers the question: will this product do the job it’s supposed to do? • If the answer is “no”, then • no schedule is on time, and • no cost is cheap enough.

  6. When? • Single review that occurs on a particular date • Single review that occurs in response to a particular condition • Multiple reviews that occur periodically: e.g., monthly • Multiple reviews that occur in response to a particular condition, e.g., code review for subsystem

  7. Benefits • 10X reduction in errors in products • 50-80% cost reduction

  8. Why not just pass it around and have reviewers sign off …

  9. Why not just pass it around and have reviewers sign off … • Comments are too large • Comments are unread • Reviewers don’t have to take responsibility for the content.

  10. Requirements • Team participation • Documented procedures • Documented results

  11. Types of Reviews • Management Review • Technical Review • Walkthrough • Inspection • Audit

  12. Types of Reviews • Management Review • Evaluation of project-level plan or project progress • Monitors status of schedules and compliance to requirements • Technical Review • Walkthrough • Inspection • Audit

  13. Management Review • Objectives: • Inform management of project status • Resolve higher-level issues (management decisions) • Agree on risk mitigation strategies • Decisions • Identify corrective actions • Change resource allocation • Change project scope • Change schedule

  14. Types of Reviews • Management Review • Technical Review • Focus on specification and design • Determine whether products meet specs • Walkthrough • Inspection • Audit

  15. Scenario for a typical review • Leader: Welcome. We are here to review X for the purpose of Y. • Each person is introduced. • The recorder is introduced. • An agenda is presented and the review process explained. • The list of material in the review packet is stated. • The review begins…

  16. Review • Agenda set by leader, agreed to by committee • Review proceeds line-by-line. • Reviewers bring up issues as they encounter them. • Issues are added to an Issues List. • Agenda changes as issues are encountered.
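The Issues List the recorder keeps can be pictured as a list of small records. A minimal sketch follows; the field names (location, description, raised_by, resolved) are illustrative assumptions, not a format prescribed by the slides:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    # All field names here are illustrative assumptions, not a standard.
    location: str      # where in the product, e.g. "spec section 3.2"
    description: str   # the issue, stated directly; raised, not resolved
    raised_by: str     # recorded publicly by the recorder
    resolved: bool = False  # resolution happens after the meeting, not in it

issues = []
issues.append(Issue(
    location="spec section 3.2",
    description="I don't understand why the upper bound is 10 here.",
    raised_by="Reviewer A",
))
print(len(issues))  # prints 1
```

The `resolved` flag defaults to False on purpose: the meeting only raises and records issues, and resolution is tracked afterwards.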

  17. Round Robin Review • Force every member to participate. • Each person takes the lead for a section: • paragraph, • line of code, • function, • test case • Good way to get involvement from team.

  18. Types of Reviews • Management Review • Technical Review • Walkthrough • Used to find anomalies and evaluate conformance • Typically used for source code • Effective form of education • Inspection • Audit

  19. Walkthroughs • Producer guides review • Step by step presentation of product: • Code, design, report, test cases … • Can cover lots of material • Can have more people attend, less prepared • More work for presenter • May be difficult to control interactions • Too many interruptions • Always hard to have a producer present

  20. Walkthroughs • Participant should spend 1 hour preparing for each hour in meeting • Should be scheduled to be brief • Should only review completed code or document • Should require participants to sign report

  21. Types of Reviews • Management Review • Technical Review • Walkthrough • Inspection • More formal type of walkthrough • Used to identify specific errors or problems • Audit

  22. Inspections • Rapid evaluation of material with a specific aspect in mind • Confine attention to one aspect only • E.g., assume the details are correct, but look for missing items at a global level • E.g., look for occurrences of a particular kind of bug, such as a buffer overflow • Selection of aspects is the key point.
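As a concrete illustration (mine, not from the slides), an inspection pass confined to one aspect, say boundary conditions in the off-by-one family that buffer overflows belong to, might flag a fragment like this:

```python
def last_n_items(items, n):
    """Return the last n elements of items."""
    return items[-n:]

# An inspection confined to boundary values asks only: what happens at n == 0?
# items[-0:] is items[0:], i.e. the WHOLE list -- a boundary defect that a
# general read-through of otherwise-correct code easily misses.
print(last_n_items([1, 2, 3], 0))  # [1, 2, 3], not the expected []
```

Because the inspectors ignore everything except boundaries, each of them can cover material quickly while still catching exactly this class of defect.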

  23. Types of Reviews • Management Review • Technical Review • Walkthrough • Inspection • Audit • Independent evaluation of process or product • Ensures that the process is being followed

  24. Review Teams • Permanent or semi-permanent teams • IV&V or QA teams • Get to be very good at reviews • Need to have some way of reviewing the reviewers.

  25. The team • Leader • Job: obtain a good review or report why it was not possible • Reflects the quality of the review process, not the quality of the product • Must be accurate • Recorder • Provides info for an accurate report of the review • Short, public notes • Capture essence of each issue • Must ensure group has reached conclusion • Don’t videotape

  26. Tasks of leader • Monitor preparedness of team members • Set pace • Keep meeting on track • Poll members to reach consensus • Ensure participation • Has right to terminate review if it is unproductive • Disagreements • Fatal flaws

  27. Consensus • Decision of the group is equal to the most severe opinion of the group members. • Be conservative. • Accept the doubts of the most doubting member. • No “I told you so, but I got outvoted”.

  28. Reviewers • Must be qualified to contribute • Must be unbiased • Don’t invite management if it causes conflict • The point is to review the project, not the producers.

  29. Rules for Reviewers • Be prepared • Evaluate product, not people. • Watch your language. • One positive, one negative. • Raise issues, don’t resolve them. • Record all issues in public. • Stick to technical issues.

  30. Rules for Reviewers • If you find something, assume it’s a mistake. These are your peers, not your enemy. Remember people are involved. • Avoid “Why did you …” or “Why didn’t you …”; say instead “I don’t understand …” • “Why did you set the upper bound to 10 here?” • “I don’t understand why the upper bound is 10 here.” • Seems trivial, but it’s not. • Do not get bogged down in style issues. For example, if efficiency isn’t an issue, then don’t make it one. • If it makes things less maintainable, that’s an issue. • If there are standards, then either stick to the standard, or dispose of the standard.
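To anchor the upper-bound example, here is a hypothetical fragment under review; the function and constant names are invented for illustration, and the two comment styles contrast the phrasings above:

```python
# Hypothetical code under review (names invented for illustration).
UPPER_BOUND = 10

def collect_samples(read_sample):
    samples = []
    for _ in range(UPPER_BOUND):
        # Good issue:  "I don't understand why the upper bound is 10 here."
        # Bad issue:   "Why did you set the upper bound to 10 here?"
        samples.append(read_sample())
    return samples

print(len(collect_samples(lambda: 0)))  # prints 10
```

The first phrasing reports the reviewer's own state (not understanding) and so cannot be wrong; the second puts the producer on the defensive even when the 10 turns out to be justified.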

  31. Number of Reviewers • Ensure material is covered • Have enough • Don’t have too many • 3-7 is good • Count participants only • 3: ensures material is understood • Outsiders can be good. Must be unbiased

  32. Time • At most 2 hours at a time • Come back again if necessary • Depends on • Complexity • Size of project • Closeness to completion

  33. Everyone must prepare! • Review packet • Everything relevant to making correct judgment: code, specs, test data, results, standards, designs, … • 80% of failures stem from unprepared teams

  34. Tactics for participation • Devil’s Advocate • One person makes strongest case against a product • Job is to find fault • Needs to be an actor • Debugging • Producers leave known bugs in • Quality of review depends on how many found • Alarms • Time each person’s contribution, cut off • Stand-up Reviews • Talk as long as you stand on one leg
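The "debugging" tactic above (producers seed known bugs, review quality is judged by how many are found) also supports a rough estimate of how many real defects remain. This is my sketch using the standard defect-seeding (capture-recapture) formula, not something stated on the slide:

```python
def estimated_total_real_defects(seeded, seeded_found, real_found):
    """Defect-seeding estimate of total real defects.

    If reviewers catch seeded_found of the seeded planted bugs, assume they
    caught the same fraction of real bugs, so the total real-defect count is
    roughly real_found * seeded / seeded_found.
    """
    if seeded_found == 0:
        raise ValueError("no seeded defects found; the estimate is undefined")
    return real_found * seeded / seeded_found

# 10 bugs seeded, reviewers found 8 of them plus 12 real ones:
print(estimated_total_real_defects(10, 8, 12))  # 15.0 -> about 3 real bugs remain
```

The estimate is crude (it assumes seeded bugs are as hard to find as real ones), but it gives the review leader a number to argue with rather than a feeling.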

  35. Report • Goal: Does the product do the job it’s supposed to do? • Complete. • Correct. • Dependable as a basis for other work. • Measurable for purposes of tracking. • What was reviewed? • Signatures of reviewers. • Leader and recorder.

  36. Notes • Requires learning: to review and to write • Need to allocate time • Not a form of project management • But can provide information for tracking • Review early and often • But not too often

  37. Need for Reviews • Reviews help organization deliver quality product • Examine individual parts early in process • Identify problems spanning multiple items or phases • Reviews help identify problems early • Reviews help build image of the product and the vision of the product in the minds of the team members

  38. Cost of Software Reviews • Up-front cost for • Training • Staff preparation • Review Conduct • Code review may be 8-15% of total cost of project • Offset by savings: reduced rework later in project (25-35% of total project cost saved) • Raytheon: pre-1988: 43% of software cost in defect correction • 1994 (after software reviews instituted): 5% of project cost in defect correction
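The slide's percentages make the cost offset easy to check with arithmetic. A small sketch, using the slide's worst-case review cost (15%) against its smallest claimed rework saving (25%); the dollar figure is an arbitrary example:

```python
def net_review_savings(project_cost, review_cost_pct, rework_saved_pct):
    """Net effect of reviews on total project cost (percentages as fractions)."""
    review_cost = project_cost * review_cost_pct
    rework_saved = project_cost * rework_saved_pct
    return rework_saved - review_cost

# Worst case from the slide: reviews cost 15% but save 25% of a $1M project.
print(net_review_savings(1_000_000, 0.15, 0.25))  # 100000.0 net saving
```

Even at the pessimistic end of both ranges, reviews come out 10% of total project cost ahead, which is consistent with the Raytheon drop in defect-correction cost from 43% to 5%.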

  39. Writing Issues • Groups of 3 • I’ll give you an issue from an issue list, • You tell me what’s wrong with it. • Issues (and responses) from Handbook of Walkthroughs, Inspections, and Technical Reviews, Daniel Freedman and Gerald Weinberg, 3rd Ed, Dorset House.

  40. Writing Issues #1 Some of the explanations of user commands were misinterpreted by members of the review committee.

  41. Comments on Issues #1 Not specific enough. Which explanations were misinterpreted? Don’t make the producers guess, or they may change the wrong thing.

  42. Writing Issues #2 A maximum of 10 values may be specified even though none of the standard system ABEND codes has been made not eligible.

  43. Comments on Issues #2 In writing, clarity isn’t the most important thing: it’s the only thing. If you don’t want to be not understood, don’t never use no double negatives. Another thing: what is the issue, anyway? Is the maximum too low? Too high? Or is it that some of the codes should have been made not eligible? Or eligible? Be direct.

  44. Writing Issues #3 The referenced table of constant values was not part of the review packet.

  45. Comments on Issues #3 Always give the most direct reference available. There could be more than one table, now or in the future. Why make the producers search?

  46. Writing Issues #4 The method used for maintaining the message queue seems to solve a severe performance problem we’ve been having with the production version of the RKY system.

  47. Comments on Issues #4 This is an interesting observation, clearly and directly stated. But what does it have to do with the product under review? It might be worth millions, but it doesn’t belong on the Issues List for this product. It should go on the Related issues list.

  48. More issues #5 The price/performance table should be sorted using either Quicksort or Shell sort.

  49. Comments #5 Raise issues, but avoid all temptation to give advice in the Issues List. Advice won’t be welcome if it comes in that form, so if you really must give it, find some unofficial vehicle. Take the producers to lunch, or out for a beer, before you share your vast experience on matters of sorting. If your idea isn’t worth the price of a beer, why not forget it?

  50. More issues #6 The three diagrams drawn by Harold Mitter are not in the standard format (DS-109) required for such diagrams in our installation.