Formal Inspection and Peer Review Process Training


Presentation Transcript


  1. Formal Inspection and Peer Review Process Training: Process for performing formal inspections and peer reviews of software artifacts

  2. Agenda

  3. Software Processes • The SEPG has developed the following processes: • GRC-SW-7150.3 - Software Project Planning • GRC-SW-7150.4 - Software Project Monitoring and Control • GRC-SW-7150.5 - Requirements Development • GRC-SW-7150.6 - Requirements Management • GRC-SW-7150.8 – Software Measurement • GRC-SW-7150.9 - Software Configuration Management • GRC-SW-7150.10 - Transition SW to a Higher Class • GRC-SW-7150.12 - Formal Inspection and Peer Review • GRC-SW-7150.14 - Software Acquisition SOW Guideline (being updated) • GRC-SW-7150.15 - Software Acquisition Planning • GRC-SW-7150.16 - Software Estimating • GRC-SW-7150.17 - Software Safety Planning • These are available on Software@Glenn.

  4. Software Documentation Templates • The SEPG has developed the following document templates: • GRC-SW-TPLT-SMP - Software Management Plan Template • GRC-SW-TPLT-SRS - Software Requirements Specification Template • GRC-SW-TPLT-RTM - Requirements Traceability Matrix Template • GRC-SW-TPLT-SCMP - Software Configuration Management Plan Template • Software Change Request/Problem Report • GRC-SW-TPLT-MMS - Master Metrics Spreadsheet • GRC-SW-TPLT-SDD - Software Design Description Template • GRC-SW-TPLT-IDD - Interface Design Description Template • GRC-SW-TPLT-STP - Software Test Plan Template • GRC-SW-TPLT-STPr - Software Test Procedure Template • GRC-SW-TPLT-STR - Software Test Report Template • GRC-SW-TPLT-SVD - Software Version Description Template • GRC-SW-TPLT-SUM - Software Users Manual Template • GRC-SW-TPLT-SMntP - Software Maintenance Plan Template • GRC-SW-TPLT-SSP - Software Safety Plan Template • GRC-SW-TPLT-SSCA - Software Safety Criticality Assessment (new) • These are available on Software@Glenn. • All are Word documents except for the Master Metrics Spreadsheet.

  5. Formal Inspection and Peer Review • This process was developed to facilitate the verification and validation inspections and reviews required by NPR 7150.2B, NASA Software Engineering Requirements, and to assist in software measurement. • The formal inspection process is described in more detail in NASA-STD-8739.9, NASA Software Formal Inspection Standard. • Use this process to find and eliminate defects in project artifacts early in the development process. • Any artifact may be inspected or reviewed. • Implements NPR 7150.2B SWE-030, SWE-031, SWE-061, SWE-069, SWE-088, SWE-089 • Implements CMMI's Verification Specific Practices 2.1, 2.2, and 2.3

  6. Formal Inspection and Peer Review

  7. Formal Inspection and Peer Review Plan the inspection (6.1) • The Moderator plans the inspection by: • Identifying the participants. See NASA-STD-8739.9 for guidance on selecting the participants. • Arranging for a meeting time, location, and facility. • Setting up training, if it is needed, for the participants. • Determining whether the product to be inspected is of reasonable size, so that the inspection can be completed in one meeting. If not, divide the product into manageable pieces. • Ensuring preparation of checklists for the reviewers and outlining key issues to guide the inspection. • Setting up a tool, if used. • Recording the amount of time spent planning the inspection on the Inspection Summary Sheet.
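
As an illustrative aside (not part of the GRC-SW-7150.12 process assets), the planning step amounts to one small record the Moderator fills in. The Python sketch below shows one hypothetical way to capture it; every field name is an assumption, not process terminology.

    # Hypothetical planning record; field names are illustrative only.
    from dataclasses import dataclass, field

    @dataclass
    class InspectionPlan:
        artifact: str                        # product under inspection, sized for one meeting
        participants: list[str]              # selected per NASA-STD-8739.9 guidance
        meeting_slot: str                    # time, location, and facility
        checklists: list[str] = field(default_factory=list)
        training_needed: list[str] = field(default_factory=list)
        planning_hours: float = 0.0          # recorded on the Inspection Summary Sheet

    plan = InspectionPlan(
        artifact="SRS sections 1-4",
        participants=["Moderator", "Author", "Reader", "Recorder", "Inspector"],
        meeting_slot="conference room, 10:00",
        checklists=["requirements checklist"],
        planning_hours=1.5,
    )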

  8. Formal Inspection and Peer Review Verify artifact meets readiness criteria (6.2) • The Moderator checks the artifact against readiness criteria to determine whether it is ready for an inspection. • The Moderator defines the readiness criteria for the artifact, using project-established criteria if available. • See NASA-STD-8739.9 for guidance on readiness criteria. • If the artifact is not ready, the Moderator stops the process and works with the Author to rework the artifact.
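
A readiness check is just a set of named pass/fail criteria applied to the artifact. The sketch below is a minimal illustration with invented criteria; real criteria come from the project or from NASA-STD-8739.9.

    # Minimal readiness gate; the criteria shown are invented examples.
    def failed_criteria(artifact, criteria):
        """Return the names of readiness criteria the artifact does not meet."""
        return [name for name, passes in criteria.items() if not passes(artifact)]

    artifact = {"spell_checked": True, "baselined": True, "pages": 35}
    criteria = {
        "spell-checked": lambda a: a["spell_checked"],
        "under configuration control": lambda a: a["baselined"],
        "sized for one meeting": lambda a: a["pages"] <= 40,
    }

    failures = failed_criteria(artifact, criteria)
    print("ready for inspection" if not failures else f"rework needed: {failures}")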

  9. Formal Inspection and Peer Review Provide training in formal inspections (6.3) • The Moderator arranges training for any of the participants who are new to formal inspections or are unfamiliar with any of the inspection tools.

  10. Formal Inspection and Peer Review Conduct overview meeting (6.4) • The Moderator and Author hold an overview meeting to familiarize the participants with the intent of the inspection, to assign roles, and to schedule the inspection meeting. • If using a Reader, the Reader leads the other inspectors through the product under review and any related material in a logical progression, paraphrasing and summarizing each section. • Participants enter the time spent in the meeting in the Inspection Summary Sheet. • The Moderator ensures that all materials to be reviewed, as well as checklists and any other guidance, are made available to the reviewers with sufficient lead time (1 week is recommended) to ensure adequate preparation for the review. • Note: The overview meeting is optional for peer reviews. If included, this meeting may be conducted virtually via e-mail or held as part of another meeting.

  11. Formal Inspection and Peer Review Prepare for inspection meeting (6.5) • Inspectors perform the following: • Review the artifact against a checklist. See Appendix C in this document for sample checklists. Also review the artifact against other appropriate references, such as • Coding standards for source code inspections • Project documentation templates for project documents • Other internal or external standards • Record defects and open issues in the inspection log during preparation for the inspection. • Record the amount of time spent inspecting the artifact. • Inform the Moderator when done and whether or not the artifact is ready for an inspection.
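
For concreteness, an inspector's log entry might carry the fields below. This is a hypothetical sketch; severity is left unset because it is assigned by team consensus at the inspection meeting, not during preparation.

    # Hypothetical inspection-log entry; field names are assumptions.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class LogEntry:
        inspector: str
        location: str                    # e.g., document section or source line
        description: str
        checklist_item: str              # checklist question that surfaced the issue
        severity: Optional[str] = None   # major/minor/editorial, set at the meeting

    entry = LogEntry(
        inspector="Inspector 1",
        location="SRS 3.2.4",
        description="Requirement is not verifiable as written",
        checklist_item="Is each requirement testable?",
    )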

  12. Formal Inspection and Peer Review Determine readiness for inspection (6.6) • The Moderator combines all inspection logs into one defect list for the inspection. • The Moderator checks readiness criteria to determine if the artifact and participants are ready for inspection. • If an Inspector is not ready, the Moderator either reschedules the inspection meeting (preferred) or removes the Inspector from the inspection. • If the artifact is not ready for an inspection, the Moderator stops the process, informs the team, and works with the Author to rework the artifact.
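
The Moderator's merge step is a simple concatenation and sort, so duplicate findings sit next to each other on the combined defect list. A minimal sketch, reusing the hypothetical entry fields from the preparation step above:

    # Combine per-inspector logs into one defect list, ordered by location.
    def merge_logs(logs):
        combined = [entry for log in logs for entry in log]
        combined.sort(key=lambda e: e["location"])
        return combined

    log_a = [{"location": "SRS 3.2.4", "description": "not verifiable"}]
    log_b = [{"location": "SRS 2.1", "description": "TBD left in text"}]
    defect_list = merge_logs([log_a, log_b])   # SRS 2.1 first, then SRS 3.2.4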

  13. Formal Inspection and Peer Review Conduct inspection meeting (6.7) • The Moderator holds the inspection meeting. • The Author and team reach a consensus on the severity of each defect (major, minor, or editorial) and whether to accept or reject it. • Open issues are closed, identified as defects, or held open for further research. • Defects are identified, not resolved, in this meeting. • The Recorder records the results of the meeting on the defect list. • Each participant records the amount of time spent in the meeting on the Inspection Summary Sheet. • The duration of the inspection meeting should be kept to 2 hours or less. If the inspection cannot be completed in that time, a second inspection meeting should be scheduled to complete it.
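
The meeting outcome for each item is therefore a severity plus a disposition. The encoding below is an illustrative sketch; only the major/minor/editorial severities come from the process text, and the disposition names are assumptions.

    # Possible meeting outcomes for each item on the defect list.
    from enum import Enum

    class Severity(Enum):
        MAJOR = "major"
        MINOR = "minor"
        EDITORIAL = "editorial"

    class Disposition(Enum):
        ACCEPTED = "accepted"   # confirmed defect; the Author will correct it
        REJECTED = "rejected"   # team agreed it is not a defect
        OPEN = "open"           # held for further research (e.g., third hour)

    defect = {"location": "SRS 3.2.4", "description": "not verifiable",
              "severity": Severity.MAJOR.value,
              "disposition": Disposition.ACCEPTED.value}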

  14. Formal Inspection and Peer Review Conduct third-hour meeting (6.8) • The Moderator holds optional third-hour meetings (as many as necessary) to identify potential solutions and resolve any open issues that arose during the inspection. • Participants may be a subset of the inspection team and may include relevant managers and technical experts. • The Moderator transfers issues with other project products to the project's issue-tracking system. • The Recorder enters the results in the defect list. • Participants record the amount of time spent in this meeting in the Inspection Summary Sheet.

  15. Formal Inspection and Peer Review Correct defects (6.9) • The Author corrects the accepted defects, updating the defect list and Inspection Summary Sheet as corrections are made. • The Author records the time spent making the corrections on the Inspection Summary Sheet.

  16. Formal Inspection and Peer Review Conduct follow-up meeting (6.10) • The Moderator holds a follow-up meeting with the Author to verify that defects have been properly corrected. • The Moderator or Recorder updates the defect list as corrections are verified. • The Moderator defines exit criteria for the artifact, using project-established criteria if available, and checks that the exit criteria have been met. See NASA-STD-8739.9 for guidance on exit criteria. • Note: The follow-up meeting is optional for peer reviews and may be conducted virtually via e-mail or held as part of another meeting. (SWE-088)

  17. Formal Inspection and Peer Review Generate report (6.11) • The Moderator or Recorder ensures that all data are collected and complete. The Moderator or Recorder generates a report on the formal inspection that includes the following: • Identification information (artifact inspected, size of the artifact, inspection type (e.g., requirements inspection, code inspection), and inspection time and date). • Summary of the total time expended on the inspection/peer review, including the time participants spent reviewing the artifact individually, the time spent in each meeting (overview, inspection, and third-hour), and the time spent correcting defects. • Participant information (total number of participants and the area of expertise of each). • Total number of defects found (number of major defects, number of minor defects, and number of defects of each type, such as accuracy, consistency, and completeness). • Inspection results summary (i.e., pass or reinspection required). • Listing of all inspection defects. • Reports may simply be the collection of inspection artifacts (participant data, defects, etc.). • Generating a report is optional for peer reviews.
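
Most of that report is a mechanical roll-up of the time log and defect list. A hedged sketch, with invented field names and a deliberately simplified pass/reinspect rule (in practice the outcome is governed by the exit criteria, not by this one-liner):

    # Roll up inspection measurements into the report fields listed above.
    from collections import Counter

    defects = [
        {"severity": "major", "type": "completeness"},
        {"severity": "minor", "type": "consistency"},
        {"severity": "minor", "type": "accuracy"},
    ]
    hours = {"overview": 1.0, "preparation": 6.5, "inspection": 2.0,
             "third-hour": 1.0, "correction": 3.0}

    report = {
        "total_hours": sum(hours.values()),
        "hours_by_activity": hours,
        "defects_by_severity": dict(Counter(d["severity"] for d in defects)),
        "defects_by_type": dict(Counter(d["type"] for d in defects)),
        # Simplified illustration: treat any major defect as forcing reinspection.
        "result": "reinspection required"
                  if any(d["severity"] == "major" for d in defects) else "pass",
    }
    print(report)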

  18. Formal Inspection and Peer Review Deliver report (6.12) • The Moderator delivers the report to the project per the project plan. • The Moderator archives the report.

  19. Using the eRoom tool for formal inspections • The eRoom tool has been developed for use by the SEPG and the Flight Software Branch. • Projects outside the branch are welcome to use it. • https://collaboration.grc.nasa.gov/eRoom • Contact KWI-Support@lists.nasa.gov, 216-433-9702

  20. Formal Inspection and Peer Review Process Resources, Tools, and Templates • Available from Software@Glenn: • http://software.grc.nasa.gov • GRC-SW-7150.12: Formal Inspection and Peer Review Process • NPR 7150.2B: NASA Software Engineering Requirements • GRC eRoom Inspection Tool • Web-based tool using eRoom's database capabilities • Available from the Agency: • NASA Engineering Network: Software Engineering Community of Practice • https://nen.nasa.gov/web/nen/home: Click on Communities -> Software Engineering • NASA Software Engineering Handbook • https://swehb.nasa.gov/

  21. Feedback on the Formal Inspection and Peer Review Process • Processes, checklists, templates, and forms are available at Software@Glenn: http://software.grc.nasa.gov • To ask questions: • Lisa Lambert, x3994 • grc-sepg-lead@lists.nasa.gov • We value your feedback: • The feedback form is on the Feedback page of Software@Glenn. • Send the completed form to grc-sepg-lead@lists.nasa.gov. • Group feedback sessions are available upon request. • Based on feedback, the process will be updated, and the updated process will be made available on Software@Glenn. • Share your products as examples for future projects!

  22. Questions?

  23. Backup

  24. CMMI v1.3 Verification Requirements • Specific Goals and Practices • SG 1 – Prepare for Verification • SP 1.1 – Select work products for verification • SP 1.2 – Establish the verification environment • SP 1.3 – Establish verification procedures and criteria • SG 2 – Perform Peer Reviews • SP 2.1 – Prepare for peer reviews • SP 2.2 – Conduct peer reviews • SP 2.3 – Analyze peer review data • SG 3 – Verify Selected Work Products • SP 3.1 – Perform verification • SP 3.2 – Analyze verification results

  25. CMMI v1.3 Generic Goal 2 • Generic Goal 2 and Practices • Goal 2 – Institutionalize a managed process • GP 2.1 – Establish an organizational policy • GP 2.2 – Plan the process • GP 2.3 – Provide resources • GP 2.4 – Assign responsibility • GP 2.5 – Train people • GP 2.6 – Control work products • GP 2.7 – Identify and involve relevant stakeholders • GP 2.8 – Monitor and control the process • GP 2.9 – Objectively evaluate adherence • GP 2.10 – Review status with higher level management

  26. NPR 7150.2B Requirements (SWEs 030, 031, 061, 069, 088, 089) • SWE-030: The project manager shall record, address, and track to closure the results of software verification activities. • SWE-031: The project manager shall record, address, and track to closure the results of software validation activities. • SWE-061: The project manager shall select, adhere to, and verify software coding methods, standards, and/or criteria. • SWE-069: The project manager shall record defects identified during testing and track them to closure. • SWE-088: The project manager shall, for each planned software peer review or software inspection: • a. Use a checklist or formal reading technique (e.g., perspective-based reading) to evaluate the work products. • b. Use established readiness and completion criteria. • c. Track actions identified in the reviews until they are resolved. • d. Identify required participants. • SWE-089: The project manager shall, for each planned software peer review or software inspection, record basic measurements.

  27. NPR 7150.2B Requirements (cont.) (SWEs 147, 149, 155, 157) • SWE-147: The project manager shall specify reusability requirements that apply to its software development activities to enable future reuse of the software, including models used to generate the software. • SWE-149: The project manager shall ensure that when an OSS component is acquired or used, the following conditions are satisfied: • a. The requirements that are to be met by the software component are identified. • b. The software component includes documentation to fulfill its intended purpose (e.g., usage instructions). • c. Proprietary, usage, ownership, warranty, licensing rights, and transfer rights have been addressed. • d. Future support for the software product is planned and adequate for project needs. • e. The software component is verified and validated to the same level required to accept a similarly developed software component for its intended use. • SWE-155: The project manager shall implement the identified software security risk mitigations addressed in the Project Protection Plan. • SWE-157: The project manager shall ensure that software systems with space communications capabilities are protected against unauthorized access.
