
State-of-the-Art Software Inspections at NASA

This project is a collaboration between NASA JPL, GSFC, CSC, and FC-MD, funded by the OSMA Software Assurance Research Program. The project aims to assess NASA lessons learned, provide training in effective inspection techniques, and run studies to measure the effectiveness of new inspection techniques.


Presentation Transcript


  1. State-of-the-Art Software Inspections at NASA. Forrest Shull & Patricia Larsen, Fraunhofer Center for Experimental Software Engineering - Maryland; Mike Stark, GSFC

  2. Outline • Introduction to this Initiative • Summary of year 1 work: Assessing NASA lessons learned • Current (year 2) work • Updating training course • Metrics collection plan • Controlled experiment • Preview of year 3 work and request for participation

  3. Project Information • This project is • a collaboration between JPL, GSFC, CSC, and FC-MD • funded by the Office of Safety and Mission Assurance (OSMA) Software Assurance Research Program • Major objectives include: • A lessons learned report on the current state of the practice in inspections at NASA • Providing training in effective inspection techniques • Running studies to provide support for new inspection techniques and measure their effectiveness

  4. Why Inspections? Long-term benefits. [Chart: Development Error Rates (1976-1995) for FORTRAN and Ada projects, errors per KDLOC plotted against project midpoint. GSFC SEL: steady decrease in error rates, an 85% improvement over 15 years.]

  5. Quick Overview. [Diagram: inspection process flow through Individual Preparation, Team Meeting, and Author Follow-up, with supporting activities.]
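As a minimal illustration of the flow sketched above, the Python snippet below models a single inspection moving through individual preparation, the team meeting, and author follow-up. The class, field, and defect names are assumptions made for illustration only; they are not part of the JPL or NASA inspection process definition.

```python
from dataclasses import dataclass, field

@dataclass
class Inspection:
    """Illustrative record of one inspection (names are hypothetical)."""
    artifact: str
    prep_defects: list = field(default_factory=list)     # logged during individual preparation
    meeting_defects: list = field(default_factory=list)  # consolidated in the team meeting
    open_items: list = field(default_factory=list)       # tracked until author follow-up closes them

    def prepare(self, reviewer, defects):
        """Each reviewer logs candidate defects before the meeting."""
        self.prep_defects.extend((reviewer, d) for d in defects)

    def hold_meeting(self, confirmed):
        """The team meeting agrees on the defect list the author must address."""
        self.meeting_defects = list(confirmed)
        self.open_items = list(confirmed)

    def follow_up(self, resolved):
        """Author follow-up: resolved items are closed, the rest stay open."""
        self.open_items = [d for d in self.open_items if d not in resolved]

insp = Inspection("Subsystem Design Document")
insp.prepare("reviewer_A", ["missing timing requirement"])
insp.prepare("reviewer_B", ["ambiguous interface spec"])
insp.hold_meeting(["missing timing requirement", "ambiguous interface spec"])
insp.follow_up(["ambiguous interface spec"])
print(insp.open_items)  # ['missing timing requirement']
```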

  6. Types of Inspections Software inspections can be used to review: • System Requirements • System Design • Subsystem Requirements • Subsystem Design • Software Requirements • Architectural Design • Detailed Design • Source Code • Test Plans • Test Procedures & Functions • Operator’s Manual, Standards, Plans, Etc. (Source: JPL Formal Inspections training)

  7. Year 1: Lessons Learned Report • Based on interviews with a range of NASA projects from several Centers • 17 participants, across 5 Centers (and 7 sites) • Generally highly experienced personnel (>7 years at current Center, >10 inspections) • Investigated why people use (or decided not to use) inspections; what they inspect; how to introduce inspections effectively • Analyzed important issues; perceived benefits; problems and difficulties

  8. Year 1: Lessons Learned Report • Summary of results • Even informal inspection practices have benefits • More formal practices have more benefits, but require management support and resources • Developers rarely question the utility after their first successful inspection • State of the practice: • Fewer groups can afford full formality • Requirements and design reviews are most important and most frequently used • Used to achieve: • Communication (team cohesion, technical understanding) • Training (team building, cross-training) • Defect reduction

  9. Year 2 Work • Act on the lessons learned report by: • Creating updated training course to address issues • Developing a metrics collection plan to demonstrate ROI in context • Running controlled experiments to evaluate state-of-the-art practices that can be introduced at NASA

  10. Year 2: Updated training • JPL Formal Inspection training identified as highly beneficial in the Lessons Learned Report • Working with GSFC’s FSB to adapt JPL materials to address current project constraints: • Modularization • Tailored to roles: moderators, inspectors, metrics collection • Shortened lecture, incorporating actual project work • Tailoring • Less JPL-specificity • Suggested agendas for various project constraints

  11. Year 2: Updated training • Incorporating lessons learned from synergistic research activities • UMCP: investigating learning curve of inspection practices • Measures of effectiveness vs. time • Applying “observational” empirical methods • Changes in process conformance over time • NSF: level of detail issues related to subject experience • Measures of effectiveness for different procedures • Interaction of detailed procedures with previous expertise

  12. Year 2: Updated training • CeBASE/NSF: collection, abstraction, and refinement of industrial data across domains, e.g.: • “Cost savings rule”: Finding and fixing software defects is about 100x more expensive after delivery than in early lifecycle phases, for certain types of defects. • IBM: 117:1 between code and use • Toshiba: 137:1 between pre- and post-shipment • Data Analysis Center for Software: 100:1 • “Inspection effectiveness rule”: Reviews and inspections find more than 50% of the defects in an artifact, regardless of the lifecycle phase applied. • 50-70% across many companies (Laitenberger) • 64% on large projects at Harris GCSD (Elliott) • 60% in PSP design/code reviews (Roy) • 50-95%, rising with increased discipline (O’Neill) • … many others Source: F. Shull et al., “What We Have Learned about Fighting Defects,” Proceedings of the 8th International Software Metrics Symposium, Ottawa, Canada, June 2002.
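To make the arithmetic behind these two rules concrete, here is a back-of-the-envelope sketch. The defect count and relative costs are hypothetical assumptions chosen only to show how the 100:1 cost ratio and the 50% effectiveness lower bound combine; they are not measured NASA data.

```python
# Hypothetical illustration of the "cost savings" and "inspection
# effectiveness" rules quoted above (all numbers are assumptions).

latent_defects = 40              # defects present in the artifact (assumed)
inspection_effectiveness = 0.5   # ">50% of defects" rule, lower bound
cost_early = 1.0                 # relative cost to fix a defect found at inspection
cost_late = 100.0                # "about 100x" rule for defects fixed after delivery

found_early = latent_defects * inspection_effectiveness
escaped = latent_defects - found_early

cost_with_inspection = found_early * cost_early + escaped * cost_late
cost_without_inspection = latent_defects * cost_late

savings = cost_without_inspection - cost_with_inspection
print(f"Relative cost avoided by inspecting: {savings:.0f} "
      f"({savings / cost_without_inspection:.0%} of the no-inspection cost)")
```

Even at the 50% lower bound, roughly half of the late-fix cost is avoided in this toy scenario; higher effectiveness rates push the savings further.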

  13. Year 2: Metrics plan • Worked with ST5 project to define feasible metrics set answering relevant questions • Application to FSB projects • Example Goal: Analyze software inspections for the purpose of evaluation with respect to number of defects reaching test phase. • Example Questions: Does introducing inspections in early phases… • …reduce the defect density of software at testing? • …reduce the overall testing effort? • …save more effort than it requires? • …result in less time to find and fix (types of) defects than if they were found through testing?
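As a rough sketch of how the example questions above could be turned into computable metrics, the functions below use assumed field names and hypothetical numbers; they are not the actual ST5 or FSB metrics set.

```python
# Illustrative metric definitions (names and data are assumptions).

def defect_density_at_test(defects_found_in_test: int, ksloc: float) -> float:
    """Defects reaching the test phase, normalized by size (defects/KSLOC)."""
    return defects_found_in_test / ksloc

def net_inspection_savings(inspection_effort_hrs: float,
                           test_fix_effort_avoided_hrs: float) -> float:
    """Positive if inspections save more effort than they consume."""
    return test_fix_effort_avoided_hrs - inspection_effort_hrs

# Hypothetical before/after comparison for one project:
baseline = defect_density_at_test(defects_found_in_test=120, ksloc=30)          # 4.0
with_inspections = defect_density_at_test(defects_found_in_test=60, ksloc=30)   # 2.0
print(baseline, with_inspections, net_inspection_savings(200, 350))
```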

  14. Year 2: Controlled experiment • Running short-term studies at JPL to evaluate candidate state-of-the-art practices • Controlled experiment between two teams with different approaches • As part of training class: before/after reviews • Re-inspection of artifact with defect history • Case study on live project

  15. Year 3: Tech transfer • Next year: transfer and dissemination of results • Delivery of train-the-trainers material • Measurement and data collection • Planning to turn over materials to • Projects at GSFC FSB, JPL, IVV directly • JPL SEPG • GSFC SEPG

  16. Request for Participation • We are looking for: • NASA projects & project leads to directly participate • Civil servants who are overseeing contracts to allow/encourage contractors to participate • Controlled experiments: Participants spend 1-2 days for: • Receiving training in state-of-the-art techniques that can be taken away and used on real projects. • Receiving feedback on types of defects detected and effectiveness of the training, and some comparison to their usual approach • Pilot studies: Participants work with us on actual projects and: • Receive training in state-of-the-art techniques tailored for their environment & project • Receive extended support for inspections including • data collection • consultation • analysis & feedback

  17. Contact Info • Contact • Forrest (lead researcher) • fshull@fc-md.umd.edu • 301-403-8970 • Mike (current government POC for contract) • michael.e.stark@gsfc.nasa.gov • 301-286-5048
