
When Peer Reviews Go Bad…








  1. When Peer Reviews Go Bad… Jim Kelly CEO, Zenkara

  2. Overview Part 1 • This presentation describes a three-year initiative that studied the implementation of walkthroughs, reviews and inspections. • Implementation strategies are presented together with analysis and lessons learned. • Obstacles arose out of the implementation; the primary challenges are described, together with what was done to address them.

  3. Overview Part 2 • Deployments of reviews are described and analysed in detail – what worked, what didn’t, and what was redone. • Finally, some lessons learned are presented, together with a discussion on formalising the detection and correction of defects with metrics embedded in the process.

  4. Scope The study involved software-intensive organisations providing industry-specific software both within Australia and internationally. The reviews were implemented across the entire system/project life cycle, and across projects ranging from 5 staff up to departments of 50 staff.

  5. General Situation at T0 Some documents were reviewed informally. Little metrics data was collected, other than basic staff costs per team. Few staff had formal review training (other than at university), and few had any experience with formal reviews.

  6. Typical Implementation Strategy • Big bang approach • Incremental scope – “hot spots”; priority by project or team • Incremental formality – walkthroughs, then reviews, then inspections • Decision on strategy usually made by Dev/Eng Managers

  7. Typical Implementation Motivation • Critical to explain the rationale to development/engineering staff • To reduce defects/problems in documents, code, tests and other objects – NOT to measure performance • Keep other key stakeholders in the loop

  8. Typical Implementation Method • Identified key documents/deliverables to be reviewed • Determined audiences and established the level of formality required (walkthroughs, reviews or inspections) • Identified basic metrics – effort, defects (including type) • Formalised existing reviews • Added informal followup • Created Call-for-Review and Review Log forms • Trained dev teams and testers • Introduced Review Closure form • Introduced code reviews • Introduced reviews of test scripts and other documents

  9. Typical Review Process Ready for review → author asks moderator → Call for Review → reviewers assess and record on log → review held → Review Closure (metrics) → author corrects, signs off and gets Closure signed → deliverable signed off
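
The flow above can be read as a simple state machine over the life of a deliverable. A minimal sketch of that reading (the state names and the `advance` helper are illustrative, not part of the documented process):

```python
from enum import Enum, auto

class ReviewState(Enum):
    """Hypothetical states mirroring the typical review flow above."""
    READY_FOR_REVIEW = auto()
    CALL_FOR_REVIEW = auto()        # moderator issues the Call for Review
    REVIEWERS_LOGGING = auto()      # reviewers assess and record on the Review Log
    REVIEW_HELD = auto()
    CLOSURE_OPEN = auto()           # Review Closure raised, metrics captured
    CORRECTIONS_SIGNED_OFF = auto() # author corrects and gets Closure signed
    DELIVERABLE_SIGNED_OFF = auto()

# Allowed forward transitions; anything else would be a process violation.
ADVANCES = {
    ReviewState.READY_FOR_REVIEW: ReviewState.CALL_FOR_REVIEW,
    ReviewState.CALL_FOR_REVIEW: ReviewState.REVIEWERS_LOGGING,
    ReviewState.REVIEWERS_LOGGING: ReviewState.REVIEW_HELD,
    ReviewState.REVIEW_HELD: ReviewState.CLOSURE_OPEN,
    ReviewState.CLOSURE_OPEN: ReviewState.CORRECTIONS_SIGNED_OFF,
    ReviewState.CORRECTIONS_SIGNED_OFF: ReviewState.DELIVERABLE_SIGNED_OFF,
}

def advance(state: ReviewState) -> ReviewState:
    """Move a deliverable to the next stage of the review workflow."""
    if state not in ADVANCES:
        raise ValueError(f"{state.name} is terminal or unknown")
    return ADVANCES[state]
```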

  10. Review Process - Types of Review 1. Walkthrough - presentation of a new object that few staff are familiar with; seen as an exchange of information. 2. Review - detecting defects and discussing solutions. 3. Inspection - detecting defects only. Formality and effectiveness increase from walkthroughs to inspections.

  11. Review Process - Roles • Moderator (also performs scribe, organiser, etc.) • Author • Reviewer • Quality Assurance

  12. Forms/Workflow • Call for Review form – basic metrics recorded (amount of time, size of deliverable to be reviewed); the form also covers criteria such as Completeness, Correctness, Traceability, Testability, Consistency, Conformance, etc. • Review Log form – enables all defects to be recorded BEFORE the review takes place. • Closure form – used to ensure the loop is closed.
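
One way to picture the three forms is as plain records. The field names below are a hypothetical reading of the slide, not the actual form layouts:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class CallForReview:
    """Captures basic metrics and the criteria reviewers apply."""
    deliverable: str
    size_pages: int            # size of the deliverable to be reviewed
    time_allocated_hrs: float  # amount of time budgeted for the review
    criteria: List[str] = field(default_factory=lambda: [
        "Completeness", "Correctness", "Traceability",
        "Testability", "Consistency", "Conformance",
    ])

@dataclass
class ReviewLogEntry:
    """Defects are recorded BEFORE the review meeting takes place."""
    defect_type: str           # e.g. 'omission', 'ambiguity'
    location: str
    description: str

@dataclass
class ClosureForm:
    """Closes the loop: author's corrections verified and signed off."""
    defects_raised: int
    defects_corrected: int
    signed_off_by: str
```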

  13. Issues with Forms/Workflow • Forms were often combined into one • Forms were initially seen by moderators as expensive in terms of time to complete • Disagreements arose over even simple classification of defect types and origin/injection points

  14. Use of Checklists Checklists were initially seen as useful by staff (who were glad they didn’t have to actually tick the checklist items). In one instance, use of checklists tapered off after 3 months; this was rejuvenated through SEPG intervention, newsletters and refresher training for moderators. In another organisation, reviews degenerated into form-filling activities – who can find the most spelling errors…

  15. Use of Checklists - Improvement Specific examples were added to assist understanding. Based on suggestions from staff, existing checklists were improved by adding and changing items, and new checklists were created. For example, one specification checklist had been applied to both user requirements and software requirements reviews; a separate checklist was created for the user requirements, and the existing specification checklist was modified accordingly.
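
The split described above (one generic specification checklist becoming two targeted ones) can be pictured as simple configuration data. The items below are purely illustrative, not the actual checklist content:

```python
# Illustrative checklist data; the real checklists were maintained as documents.
CHECKLISTS = {
    "user_requirements": [
        "Is each requirement expressed in the user's vocabulary?",
        "Does each requirement have an identified source or stakeholder?",
        "Is each requirement testable from the user's point of view?",
    ],
    "software_requirements": [
        "Does each requirement trace to a user requirement?",
        "Are interfaces, constraints and error conditions specified?",
        "Is each requirement unambiguous and verifiable?",
    ],
}

def checklist_for(document_type: str):
    """Return the checklist applicable to a given document type."""
    return CHECKLISTS.get(document_type, [])
```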

  16. Role of the S/EPG or SQAG A distinguishing feature of the implementation was the lead role taken by the S/EPG (Software Engineering Process Group / Enterprise Process Group) and/or Software Quality Assurance Group members. The SEPG gradually withdrew from reviews of specific types of documents, and its members performed an SQA role from then on. Ownership of the process was gradually transferred to the software development manager. Roles: SEPG – driving the process implementation; SQA – conformance to standards and implemented processes.

  17. Management Commitment • Managers provided verbal support – limited success • Managers attended training sessions – limited success • Managers attended reviews – some success, but limited to egalitarian/flat-structure organisations • Managers had their own deliverables reviewed – a significant motivator for staff; leading by example

  18. Management Commitment Another form of commitment: Re-estimated programs/projects to enable sufficient budget (time and money) for reviews.

  19. Use of Metrics Metrics were used to guide both the conduct of the reviews and the effectiveness of the review process itself. Conduct: effort spent reviewing documents and in the reviews themselves; common types of defects; % of deliverables slipping through the process. Effectiveness: customer satisfaction (surveys and anecdotal); % of defects detected later than their injection phase.
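
The "% of defects detected later than their injection phase" measure can be computed directly from the defect records. A minimal sketch, assuming each record carries an injection phase and a detection phase (the phase list and function name are illustrative):

```python
PHASES = ["requirements", "design", "code", "test", "operation"]

def pct_detected_late(defects):
    """Percentage of defects found in a later phase than the one that injected them.

    `defects` is an iterable of (injected_phase, detected_phase) pairs.
    """
    total = late = 0
    for injected, detected in defects:
        total += 1
        if PHASES.index(detected) > PHASES.index(injected):
            late += 1
    return 100.0 * late / total if total else 0.0

# Example: two of three defects slipped past their injection phase.
sample = [("requirements", "requirements"),
          ("requirements", "test"),
          ("design", "code")]
print(pct_detected_late(sample))  # roughly 66.7
```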

  20. Graphs A large number of graphs were produced as part of the analysis of the metrics collected (number and class of defects, size, effort expended). Their initial use was to identify the current state of development and to assess what needed immediate attention: # reviews per month; # defects identified in each phase; # defects per document at T0; # hours of rework for each phase at T0.
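
A sketch of how one of the initial graphs (defects identified in each phase at T0) might be produced from the collected metrics; the counts below are purely illustrative and matplotlib is assumed to be available:

```python
import matplotlib.pyplot as plt

# Hypothetical T0 data: defects identified per life-cycle phase.
phases = ["Requirements", "Design", "Code", "Test"]
defects_at_t0 = [12, 30, 55, 48]  # illustrative counts only

plt.bar(phases, defects_at_t0)
plt.title("Defects identified in each phase at T0")
plt.ylabel("Number of defects")
plt.tight_layout()
plt.show()
```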

  21. Graphs - Initial (chart; ramp-up took between 10 and 18 months)

  22. Graphs - Initial

  23. Analysis • Initial over-focus on certain phases • Overall cost reduction from moving defect correction earlier in the life cycle • Overall customer satisfaction improved through better deliverables reaching the customer • An initial negative impact appeared as latent defects were identified, but this was not recognised as a positive result of the reviews

  24. Graphs - Subsequent Subsequent use was to provide guidance on what was working and what needed improvement, together with justification, both internally and to the customer, of the expense of reviews: % of reviews by document type; # defects per document at T0 + 1 year; # defects injected in each phase; # hours of rework for each phase at T0 + 1 year; # defects injected in each phase vs identified in each phase.
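
The last graph, defects injected in each phase vs identified in each phase, reduces to two tallies per phase. A minimal sketch, again assuming each defect record carries an injection phase and a detection phase (names are illustrative):

```python
from collections import Counter

def injection_vs_detection(defects):
    """Per phase, count how many defects were injected there and how many were found there.

    `defects` is an iterable of (injected_phase, detected_phase) pairs.
    """
    injected = Counter(inj for inj, _ in defects)
    detected = Counter(det for _, det in defects)
    phases = sorted(set(injected) | set(detected))
    return {p: (injected[p], detected[p]) for p in phases}

sample = [("design", "design"), ("design", "test"), ("code", "test")]
print(injection_vs_detection(sample))
# {'code': (1, 0), 'design': (2, 1), 'test': (0, 2)}
```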

  25. Graphs - Impressions: Management Management were initially interested in the number of reviews and the workload required for reviews. They found the overall reduction in average defects per delivered module very useful for justifying the cost of reviews to directors and for demonstrating commitment to improvement to customers.

  26. Graphs - Impressions: Staff There was general concern about anonymity in most organisations; it was demonstrated that no individuals could be identified, as data was only reported at the team and organisation level. The defects-identified-per-phase and defects-per-document-type graphs were used to identify improvements and to avoid common types of errors. Defects per phase (as a percentage) proved useful for starting to estimate expected defect levels in future projects.

  27. Initial Analysis - Findings • A lot of effort and duration was spent on rework • Staff were not using the checklists systematically or applying the perspectives • The order of documents/deliverables needed to be well established • More training was needed • Many documents were sent to review too early – entry criteria were needed

  28. Subsequent analysis of review process Staff were relying on reviews to find defects, and using them as a ‘catch-all’. Improvements to the requirements management and design processes were identified.

  29. Training Training was generally conducted on two levels: 1. A general overview of the review process. 2. More in-depth training sessions and exercises on moderating/leading reviews. This training was essential during the early stages. Some organisations used buddy training for most engineering practices.

  30. Followup and Refresher Courses Just as important as the initial training were regular followup and refresher courses. These helped ensure that omissions and mistakes – common mistakes and gaps in collecting information – were corrected with appropriate information. The followup also took the form of regular memos and newsletters covering the why and the strategies, graphs, war stories and benefits, and improvements.

  31. Benefits of the Reviews • ROI of between 2:1 and 5:1 in the first year • Earlier identification of defects • Improved customer satisfaction through the communication/perception of a rigorous process • Reduced common types of errors
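
One way to read the ROI figures is as the ratio of rework cost avoided to the cost of the reviews. A worked sketch of that arithmetic (all figures hypothetical, not from the study):

```python
def review_roi(review_effort_hrs, defects_found, avg_rework_hrs_saved_per_defect):
    """ROI as (rework cost avoided) : (effort spent on reviews).

    All inputs are illustrative; a real programme would price effort in dollars.
    """
    cost_avoided = defects_found * avg_rework_hrs_saved_per_defect
    return cost_avoided / review_effort_hrs

# Example: 200 review hours finding 100 defects that would each have cost
# roughly 6 hours to fix later gives an ROI of about 3:1.
print(round(review_roi(200, 100, 6), 1))  # 3.0
```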

  32. Benefits of the Reviews • Improved quality of product for each customer in the delivery chain • Reduced rework after delivery • Information exchange – product AND process

  33. Side Effects of the Reviews • Participants identified the need for other processes and standards (to help improve the documents). • Customers became involved much earlier than expected (the customer was impressed with the sophisticated level of the reviews, and one customer actually wanted to implement the process at their own company). • A review metrics database was introduced, which quickly evolved to include defect tracking, estimation and the company’s Balanced Scorecard.

  34. Keeping the Ball Rolling • Set coverage targets of 20%, 50% and 80%. • Initially include participation in reviews as an appraisal criterion. • Then, after 6 months and once the process has bedded down, include team coverage targets in the appraisal criteria. • Link an award to the highest coverage rate.
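
Coverage here is simply the proportion of a team's deliverables that went through a peer review. A small sketch of tracking a team against the staged 20/50/80% targets (function and variable names are illustrative):

```python
def review_coverage(deliverables_reviewed, deliverables_total):
    """Percentage of a team's deliverables that were peer reviewed."""
    if deliverables_total == 0:
        return 0.0
    return 100.0 * deliverables_reviewed / deliverables_total

TARGETS = [20, 50, 80]  # staged coverage targets from the slide

coverage = review_coverage(12, 20)           # e.g. 12 of 20 deliverables reviewed
met = [t for t in TARGETS if coverage >= t]  # targets achieved so far
print(coverage, met)                         # 60.0 [20, 50]
```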

  35. Keeping the Ball Rolling • Ensure someone monitors the coverage rate and runs refresher training when needed – often the SEPG/SQAG, but more effective when performed by team leaders. • Make it FUN! Get the participants to review the review process itself; get the developers to review management material.

  36. Strategy and Method • Begin with the end in mind – what goal do you want to achieve using reviews? • Ensure that reviews are not seen as an excuse to shift responsibility. • Don’t have too many meetings. • Hold light peer reviews (without the Call for Review). • Try different approaches to the review process (tailor it for the organisation) and to the implementation of the process.

  37. Outsourcing and Cultural Issues • Review processes need careful adjustment as multi-national teams are formed. • Reviews need to be expressed from different perspectives. • The review process provides visibility into the processes at outsourcing partners.

  38. Metrics • Start collecting metrics as soon as possible. • Provide guidance figures based on those metrics as soon as possible: average pages/hr for individual review; average pages/hr for group review; ‘normal’ data – expected defects per page/KLOC.
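
The guidance figures are plain averages over historical review records. A sketch, assuming each record holds pages, hours and defect counts (the field names are assumptions, not the actual metrics database schema):

```python
def guidance_figures(records):
    """Average review rate (pages/hr) and defect density (defects/page).

    `records` is a list of dicts with 'pages', 'hours' and 'defects' keys.
    """
    total_pages = sum(r["pages"] for r in records)
    total_hours = sum(r["hours"] for r in records)
    total_defects = sum(r["defects"] for r in records)
    return {
        "pages_per_hour": total_pages / total_hours,
        "defects_per_page": total_defects / total_pages,
    }

history = [{"pages": 20, "hours": 4, "defects": 10},
           {"pages": 30, "hours": 5, "defects": 9}]
print(guidance_figures(history))
# {'pages_per_hour': 5.55..., 'defects_per_page': 0.38}
```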

  39. About – Jim Kelly Jim Kelly is CEO of Red Onyx Systems. He has extensive experience in process improvement and software engineering, and has consulted for over twenty years across software-intensive industries including aerospace, defence, engineering, utilities and finance. Jim has presented at numerous conferences and workshops in areas such as business improvement, software metrics, risk management and quality. He has served on industry boards such as ASMA and SQA, and was involved in editing the ISO 9126 Software Product Quality standard. Jim holds a Bachelor’s degree in Informatics, a GradDip in Software Quality and an MPhil in Software Quality. He can be contacted at jim@redonyx.com.au

  40. Who Are We? Zenkara provides advanced technology for mission-critical software engineering companies. Head Office: 2/27 Waverley Street, Annerley Qld 4064, Australia. PO Box 1622, Milton QLD 4064. Mobile: +61 (0)410 616125 | Direct: +61 (0)7 3892 6771. Email: jim@zenkara.com. Web: http://www.zenkara.com. ABN: 85 050 409 660
