
Software Independent Verification and Validation (IV&V): An Agency Overview



  1. Software Independent Verification and Validation (IV&V): An Agency Overview Kenneth A Costello IV&V Program Lead Engineer GSFC Systems Engineering Seminar Series 12 Sep 2006

  2. Agenda • A Quick IV&V Facility/Program History • The “Software Crisis” • IV&V/NASA IV&V • Forming an IV&V Project • IV&V Relationships • Closing

  3. Setting the stage: A History
  • 05/88: Space Shuttle Program implements IV&V
  • 10/91: Grant provided to WVU to build the IV&V Facility, assigned to HQ-OSMA (early focus: research)
  • 04/94: Space Station Program implements IV&V through the Facility
  • 04/96: Facility transitioned to Ames Research Center
  • 1996: Facility Omnibus contract enables IV&V across all NASA Projects
  • 06/99: Senior Management Council: IV&V mandate for all NASA software
  • 07/00: Facility transitioned to Goddard Space Flight Center
  • 08/01: NPD 8730.4, Software IV&V Policy (focus shifts to IV&V)
  • 05/03: NASA Executive Council makes IV&V an Agency OSMA Program
  • 10/03: IV&V funding changed to Corporate G&A
  • 08/05: NPD 2820.1, Software Policy
  • [Timeline chart, 1988-2004: the number of IV&V and IA projects grew from 1-3 in the earliest years to roughly 42 by 2004]

  4. Setting the Stage: An Agency Requirement • NPD 8730.4 SW IV&V Policy • Cancelled on 08/30/05 • Current Requirements • NPD 2820.1C Software Policy • NPR 7150.2 Software Engineering Requirements • NASA-STD-8739.8 Software Assurance

  5. NPD 2820.1C Software Policy
  • NASA policy regarding software activities for each project is to accomplish the following:
  (5) Projects shall ensure software providers allow access to software and associated artifacts to enable insight/oversight by software engineering and software assurance, which includes Independent Verification and Validation (IV&V) and NASA's Safety and Mission Assurance organizations.
  c. Use the NASA IV&V Facility as the sole provider of IV&V services when software created by or for NASA is selected for IV&V by the NASA Chief Safety and Mission Assurance Officer.
  • Responsibilities
  c. The NASA Chief Safety and Mission Assurance Officer shall: (1) … (6) Oversee the functional management of the NASA IV&V Program and assure the performance of all of IV&V processes, services, and activities. (7) Establish and manage processes for the selection of software to which to apply IV&V. (8) Charter the IV&V Board of Directors (IBD), which makes prioritized recommendations for allocating IV&V services to projects based on the annual Software Inventory (maintained by the Chief Engineer) and the Office of Safety and Mission Assurance (OSMA) defined process. (9) Select and maintain the list of software projects to which IV&V is to be applied. (10) …
  d. The IV&V Program Manager shall 1) establish and manage the Agency's software IV&V services and procedures; 2) establish, maintain, and report on the results of IV&V services and findings; and 3) support NASA's program for improving software assurance and other trusted verifications (e.g., independent assessments, peer reviews, and research). The IV&V Facility shall determine and document the services provided by the Facility on projects selected for IV&V by the NASA Chief Safety and Mission Assurance Officer.

  6. NPR 7150.2 Software Engineering Requirements • Section 5.1.1.1 states required content for SW Development Plans: “The Software Development or Mgmt Plan shall contain: [SWE-102] a. Project organizational structure showing authority and responsibility of each organizational unit, including external organizations (i.e., Safety and Mission Assurance, Independent Verification and Validation (IV&V), Independent Technical Authority (ITA), NASA Engineering and Safety Center (NESC)).” • Additionally, section 5.1.5, which addresses SW Assurance, states: “The SW Assurance Plan details the procedures, reviews, and audits required to accomplish software assurance. The project office should coordinate, document, and gain concurrence with the Office of Safety and Mission Assurance as to the extent and responsibilities of the assurance and safety of the project. This will be documented into the project plans and reflected in the assurance process.” • Section 5.1.5.1 states “The SW Assurance Plan(s) shall be written per NASA-STD-8739.8, NASA SW Assurance Standard. [SWE-106]”.

  7. NASA-STD-8739.8 Software Assurance • Std states the following: • Section 6.1.4 When IV&V has been selected for a project, the provider shall coordinate with IV&V personnel to share data and information. • Section 7.5.3 When the IV&V function is required, the provider shall provide all required information to NASA IV&V Facility personnel. (This requirement includes specifying on the contracts and subcontracts, IV&V’s access to system and software products and personnel.)

  8. Independent Verification and Validation: The NASA Approach A Software Crisis

  9. Growing Software Importance • Fundamental Concern: • The first NASA robotic mission with actual software launched in 1969 (Mariner 6) • Software size has grown over time • 128 words of assembly, equivalent to about 30 lines of C code • MER has about 600,000 lines of equivalent C code • More functionality is being placed within software and software-constructed devices (Programmable Logic Devices) • With increased processing power and memory, more tasks are running concurrently • Control software is increasing in complexity and size • Software is used to monitor and react to hardware faults

  10. Software is still hard to get right • The Carnegie Mellon Software Engineering Institute reports(1) that at least 42-50 percent of software defects originate in the requirements phase. • The Defense Acquisition University Program Manager Magazine(2) reports, citing a Department of Defense study, that over 50 percent of all software errors originate in the requirements phase. 1 – Carnegie Mellon Software Engineering Institute, The Business Case for Requirements Engineering, RE 2003, 12 September 2003 2 – Defense Acquisition University Program Manager Magazine, Nov-Dec 1999, Curing the Software Requirements and Cost Estimating Blues

  11. Fixing errors early can conserve resources • Early error detection and correction are vital to development success • The cost to correct software errors multiplies during the software development life cycle • Early error detection and correction reduces cost and saves time • IV&V assurance is vital to mission success • Independent evaluation of critical software is a needed, value-adding activity and an Agency goal • [Chart: average relative costs of finding errors late in the life cycle, from Barry Boehm, "Software Engineering Economics"]
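For a sense of scale, here is a minimal sketch of that cost escalation, using the relative cost-to-fix multipliers commonly attributed to Boehm's data. The specific numbers below are illustrative approximations, not values taken from this presentation:

```python
# Illustrative relative cost-to-fix multipliers, in the spirit of Boehm's
# "Software Engineering Economics" data; the values are assumed, not quoted.
RELATIVE_COST = {
    "requirements": 1,
    "design": 5,
    "code": 10,
    "test": 20,
    "operations": 100,
}

def escalation(found_in: str, introduced_in: str = "requirements") -> float:
    """Relative cost of fixing a defect found later than it was introduced."""
    return RELATIVE_COST[found_in] / RELATIVE_COST[introduced_in]

# A requirements defect that survives into operations costs roughly
# 100x as much to fix as one caught during requirements analysis.
print(escalation("operations"))  # 100.0
```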

  12. Overview of Defects found by IV&V Teams

  13. Independent Verification and Validation: The NASA Approach Independent Verification and Validation

  14. What is Verification and Validation? • Simply put, assuring that a software system meets the user’s needs • Verifying that the software is accurate and representative of its specification • Validating that the software will do what the user really wants it to do

  15. What is up with that I? • I = Independent • Financial: funded from Corporate G&A for Agency-identified high-priority Projects • The customer Project may also fund the effort • Technical: the IV&V program defines scope and tasks, tailored by an IV&V criticality assessment • Uses a predefined work breakdown structure • Managerial: functional management supplied by OSMA • Project management supplied from the IV&V program

  16. So what is IV&V? • An engineering discipline employing rigorous methods for evaluating the correctness and quality of the software product throughout the software life cycle from a system level viewpoint. • The NASA Software IV&V approach covers not only expected operating conditions but the full spectrum of the system and its interfaces in the face of unexpected operating conditions or inputs. 

  17. So what else is IV&V? • Testing at the end of the life cycle? • No • IV&V is testing, but it is whole life cycle testing • The IV&V team “tests” artifacts ranging from system and software requirements to source code and test results • Each task in the IV&V WBS is designed to “test” a development artifact or process

  18. What are the objectives of IV&V? • Find defects within the system with a focus on software and its interactions with the system • Make an assessment of whether or not the system is usable in an operational environment, again with a focus on the software within the system • Identify any latent risks associated with the software

  19. What is the goal of IV&V? • Establish confidence that the software is fit for its purpose within the context of the system • Note that the software may not be free from defects • Rarely the case and difficult to prove • The software must be good enough for its intended use • As described by the requirements • Correct requirements • The type of use will determine the level of confidence that is needed • Consequence of software defect/failure

  20. Are there any other benefits to IV&V? • The primary purpose is to provide confidence to OSMA, however... • Development projects receive all findings • The good, the bad, the ugly • Allows the PM to have an unbiased view of the software development effort • Provides a knowledge resource for software developers • In-phase IV&V work provides early error detection and may save the project money in error correction

  21. IV&V is process as well as product oriented
  • Program processes: software schedules, development tracking, critical path analysis, configuration management
  • Ancillary developments: simulations, trainers, test environments
  • Increased probability of success: good processes allow early error identification and correction; quality documentation enhances software maintenance

  22. IV&V Increases Project Awareness
  • IV&V is a program-level “tool” to efficiently and effectively manage software development risk
  • [Diagram: IV&V feeds the Program through recurring status reviews (on weekly to monthly cadences) and phase-complete analysis reports across the requirements and design phases, supporting identification of top risks and evaluation of Program development and schedule status]

  23. IV&V Interfaces and Reporting • Formal and informal interfaces with developers • The formal interface is with an IV&V Program project manager • Informal interfaces exist between the IV&V analysts and the developers • This helps get identified problems and issues into the appropriate hands quickly • Results of the effort are thoroughly documented • Issues identified to the developers in a timely manner • Status reports to Project Management • Monthly/quarterly reviews to GPMCs/Directorates/HQs • Project close-out report • All inputs and outputs archived • Final report delivered to the project for its own internal records • Lessons learned documented throughout

  24. NASA IV&V

  25. Agency Generic IV&V Scoping and Costing Flow

  26. The IV&V Life Cycle • An IV&V Project follows a life cycle similar to most Projects • Formulation • Execution • Close-out

  27. Formulation Phase • The Formulation phase is used to plan and scope the work to be performed • It usually starts prior to the System Requirements Review (SRR) with initial planning and contact with the Project • The planning and scoping process is generally executed between SRR and the Preliminary Design Review (PDR) • A criticality analysis is developed as the foundation for the IV&V effort on the project • The effort addresses all of the software on a Project • The process generates a tailored approach based on the results of the assessment

  28. Execution Phase • The majority of the IV&V effort is performed in this phase • The work is documented in an IV&V Plan (IVVP) that is an output of the Formulation work • The IVVP is provided to the Project for review and applicable concurrence • The approach is taken from the WBS and tailored based on the results of the Formulation work • The Execution phase generally ends at or shortly after launch • In some cases, work may extend beyond launch when software is still being developed (e.g., MER)

  29. IV&V WBS for NASA Missions • The purposes of the IV&V Work Breakdown Structure are to • Provide a consistent approach to IV&V across the Agency • Provide a consistent and comprehensive basis for collection and reporting of metrics • Help Projects anticipate and understand what IV&V will do • The IV&V WBS was developed using industry standards and IV&V history on NASA missions as references • IEEE Std. 1012-2004, IEEE Standard for Software Verification and Validation • IEEE/EIA 12207.0-1996, Standard for Information Technology - Software life cycle processes • WBS Tasks for NASA Missions • Task selection is based on an algorithm using software development risk (a simplified sketch follows) • Risk is generated based on various Project characteristics (size, complexity, reuse, risk, etc.) as part of the IV&V planning and criticality analysis tasks • The full WBS can be found at http://ims.ivv.nasa.gov/isodocs/IVV_09-1.pdf
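A minimal sketch of that risk-driven task selection, with hypothetical thresholds and task names invented for illustration; the actual algorithm and task list are defined in the IV&V WBS document linked above:

```python
# Hypothetical sketch of risk-driven WBS task selection. The risk scale,
# thresholds, and task names below are placeholders; the real algorithm
# lives in the IV&V WBS document (IVV 09-1, linked above).
def select_wbs_tasks(risk_score: int) -> list[str]:
    """Map a software development risk score (assumed 1-5) to IV&V tasks."""
    tasks = ["requirements analysis"]        # assumed always performed
    if risk_score >= 2:
        tasks.append("design analysis")
    if risk_score >= 3:
        tasks.append("code analysis")
    if risk_score >= 4:
        tasks.append("independent testing")
    return tasks

print(select_wbs_tasks(4))
# ['requirements analysis', 'design analysis', 'code analysis', 'independent testing']
```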

  30. IV&V Activities Fit within the Project Schedule
  • IV&V is designed to mesh with the Project schedule and provide timely inputs to mitigate risk
  • Dialog between the IV&V Facility and the Project begins before the System Requirements Review (SRR)
  • IV&V provides support and reports for Project milestones
  • Technical Analysis Reports document major phases
  • The IVVP is updated to match changes in the Project
  • [Schedule diagram: IV&V phases (1.0 Independent Support, 2.0 Concept, 3.0 Requirements, 4.0 Design, 5.0 Implementation, 6.0 Test, 7.0 Operations & Maintenance; numbers correspond to the IV&V WBS) run alongside Project milestones: SRR (initial IVVP signed), Preliminary Design Review (baseline IVVP signed), Critical Design Review, System Test, S/W FQT, Mission Readiness Review (IV&V provides CoFR), Launch (IV&V Final Report), System Retirement]

  31. Close Out Phase • The Close Out phase concludes the IV&V effort • All of the work performed is summarized in a final technical report • Additionally, Lessons Learned are captured and either documented separately or incorporated into the final technical report • In some cases, the IV&V Team is retained to provide mission support during critical phases of the Project which may occur after Close Out of the primary effort

  32. The IV&V Life Cycle Flow
  • Concept Phase: focused activity at the earliest point; system requirements and the role of software are important; issues are introduced at the lowest level
  • IV&V proceeds in phase with development: verification of system requirements, software planning, software requirements, design, and implementation (including the simulator/environment/hardware), followed by validation testing
  • Verification covers all levels of testing and ensures that the system meets the needs of the mission
  • Later life cycle activity is also important: issues are still introduced at the lowest level, with focus more on individual components
  • Maintenance: IV&V support continues over the initial operational phase and beyond, based on the mission profile

  33. IV&V Testing Philosophy • Most testing is designed to show the software works within the envelope of the mission (“Test what you fly, fly what you test”) • The IV&V approach is to focus more on off-nominal and unexpected situations in the software • The higher the level of confidence needed, the deeper the analysis • The guiding goal is not necessarily to perform additional testing • The goal is to improve the Project's test planning and execution • In some cases, IV&V may independently test highly critical software • [Diagram: IV&V test categories (component-based testing, integration, system testing, acceptance testing) mapped against the Project's test levels: Unit Test (CSC, CSCI), S/W Integration, S/W Functional Qualification Testing, System Integration and Test, Acceptance Testing]
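As a toy illustration of the nominal-versus-off-nominal distinction described above (the function, its limits, and the scenarios are invented for this example, not taken from any NASA system):

```python
import math

# Toy example only: a thrust-command limiter invented to illustrate
# nominal vs. off-nominal testing; not from any real flight software.
def limit_thrust(commanded: float, max_thrust: float = 100.0) -> float:
    """Clamp a commanded thrust value into the allowed envelope."""
    if math.isnan(commanded):
        raise ValueError("commanded thrust is NaN")
    return min(max(commanded, 0.0), max_thrust)

# Nominal: inside the mission envelope ("test what you fly")
assert limit_thrust(50.0) == 50.0

# Off-nominal: the cases IV&V emphasizes -- out-of-range and bad inputs
assert limit_thrust(-10.0) == 0.0    # below range is clamped, not propagated
assert limit_thrust(1e9) == 100.0    # far above range is clamped
try:
    limit_thrust(float("nan"))       # invalid input must be rejected
except ValueError:
    pass
```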

  34. Independent Verification and Validation: The NASA Approach Forming an IV&V Project

  35. IV&V Project Requirements: Background • A critical first step is to develop the requirements for the IV&V project • These are a set of engineering/management tasks determined through a criticality analysis process • This was previously accomplished individually by different NASA contractors using different processes • This sometimes led to confusion with the NASA development projects, as there was little consistency • There was also a mixture of terminology that was sometimes in conflict with other NASA terminology and industry-standard terminology • There was also a perception among some parts of NASA that the IV&V contractors were determining their own work

  36. Software Integrity Level Assessment Process • To help mitigate or eliminate some of these issues, the IV&V Program undertook an initiative to develop a new process • It examined the best of current criticality analysis processes from industry and academia • The primary objective of the process is to develop the requirements for an IV&V project

  37. SILAP (Software Integrity Level Assessment Process) Goals

  38. Software Integrity Level (SIL) • Software Integrity Levels • Want to define, for a software component, the required level of integrity in terms of its role in the system • Understand how the component fits within the system • Understand what is required of that component to be able to maintain the functionality of the system

  39. Software Integrity Level: Definition • Definition of Software Integrity Level • A range of values that represent software complexity, criticality, risk, safety level, security level, desired performance, reliability, or other project-unique characteristics that define the importance of the software to the user and acquirer • The characteristics used to determine software integrity level vary depending on the intended application and use of the system. • A software component can be associated with risk because • a failure (or defect) can lead to a threat, or • its functionality includes mitigation of consequences of initiating events in the system’s environment that can lead to a threat • Developed using not only software but also system level integrity as a basis (ISO/IEC 15026, 6)

  40. Risk: A Common Denominator • Previously, development projects (IV&V stakeholders) could not easily link risk with the scoring that was performed • A prime requirement for the new process is that it clearly defines the system risk and links it to the software • The process was built around two project factors whose combination defines a level of system risk linked to the software • The factors are Consequence and Error Potential

  41. Consequence vs. Error Potential • Consequence is a measure of the system-level impact of an error in a software component • Generally, take the worst-case error (at the software component level) that has a reasonable or credible fault/failure scenario • Then consider the system architecture and try to understand how that software fault/failure scenario may affect the system • Error Potential is a measure of the probability that the developer may insert an error into the software component • An error is a defect in the human thought process • A fault is a concrete manifestation of errors within the software • A failure is a departure of the system behavior from the requirements • With these definitions in mind, the approach is not to assess faults or failures, but to assess errors • Scoring against these two factors is covered on the following slides

  42. Consequence • Consequence consists of the following items • Human Safety – This is a measure of the impact that a failure of this component would have on human life • Asset Safety – This is a measure of the impact that a failure would have on hardware • Performance – This is a measure of the impact that a failure would have on a mission being able to meet its goals

  43. Error Potential • Error Potential consists of the following items • Developer Characteristics • Experience – This is a measure of the system developer’s experience in developing similar systems • Organization – This is a measure of the complexity of the organization developing the system (distance and number of organizations involved tend to increase the probability of errors being introduced into the system) • Software/System Characteristics • Complexity – This is a measure of the complexity of the software being developed • Degree of Innovation – This is a measure of the level of innovation needed in order to develop this system/software • System Size – This is a measurement of the size of the system in terms of the software (i.e., Source Lines of Code)

  44. Error Potential (2) • Development Process Characteristics • Formality of the Process – This is a measure of the maturity of the developer's processes • Re-use Approach – This is a measure of the level of re-use for the system/software • Artifact Maturity – This is a measure of the current state of the development documentation in relation to the state of the overall development project (e.g., the project is past critical design review but the requirements documents are still full of TBDs and incomplete items)

  45. Determining the Scores • Using the criteria, each software component is assessed and a score is generated • The scores are then processed through an algorithm to create final scores for Consequence and Error Potential • The algorithm takes into account a weight for each of the characteristics • Note that the Human Safety score carries no weight; rather, it is treated in a special manner, as shown on the next slide

  46. Calculating Consequence • The following algorithm is used to determine the final Consequence score:
  - If a component has no human safety impact, then Human Safety (hs) = 0; otherwise score Human Safety 1-5 using the criteria (this step defines the hs score)
  - Score the Asset Safety (as) 1-5 using the criteria
  - Score the Performance (pf) 1-5 using the criteria
  - If hs > (.35as + .65pf), then final score = hs; else final score = (.35as + .65pf)
  • This last step is important as it places emphasis on human safety by using it as an overriding score if it is larger than the sum of the weighted asset safety and performance scores
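A direct transcription of that algorithm into Python, as a minimal sketch (the 1-5 scoring criteria themselves come from the SILAP documentation and are not reproduced in this presentation):

```python
def consequence(hs: int, asset: int, perf: int) -> float:
    """Final Consequence score per the algorithm on the slide above.

    hs    -- Human Safety: 0 if no human safety impact, else scored 1-5
    asset -- Asset Safety, scored 1-5 against the criteria
    perf  -- Performance, scored 1-5 against the criteria
    """
    weighted = 0.35 * asset + 0.65 * perf
    # Human safety overrides whenever it exceeds the weighted combination
    return hs if hs > weighted else weighted

print(consequence(hs=4, asset=2, perf=3))  # -> 4 (human safety dominates 2.65)
```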

  47. Calculating Error Potential • The algorithm for the Error Potential calculation has no special provisions; it is simply a sum of the weighted scores: Error Potential = Σᵢ wᵢvᵢ • Each attribute has a value (vᵢ) generated during the assessment and a pre-defined weight (wᵢ) • The first three terms represent the high-level weights • Note that all scores are rounded to the next whole integer
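A sketch of that weighted sum, assuming the pre-defined weights are supplied externally; the actual weight values are not given in this presentation, and "rounded to the next whole integer" is interpreted here as rounding up:

```python
import math

def error_potential(values: dict[str, int], weights: dict[str, float]) -> int:
    """Weighted sum of attribute scores, rounded up to the next integer.

    values  -- 1-5 scores (v_i) generated during the assessment
    weights -- pre-defined weights (w_i); real values are not in this deck
    """
    total = sum(weights[name] * score for name, score in values.items())
    return math.ceil(total)  # "rounded to the next whole integer"

# Hypothetical scores and weights for attributes from slides 43-44
values = {"experience": 3, "organization": 4, "complexity": 5}
weights = {"experience": 0.3, "organization": 0.3, "complexity": 0.4}
print(error_potential(values, weights))  # ceil(0.9 + 1.2 + 2.0) = 5
```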

  48. Developing Tasking • A tasking set is based on each individual score • There is tasking associated with a given Consequence score • There is tasking associated with a given Error Potential score • One set of tasks per component • The tasks are not exclusive to a given score • This results in a matrix of software components and scores that provides the starting set of requirements for IV&V on that project • The current matrix of scores and tasks is provided on the next slide (a simplified lookup sketch follows)
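A minimal sketch of the score-to-tasking lookup described above; the task names and score bands below are placeholders, since the real matrix appears on the next slides of the original deck and is not reproduced in this transcript:

```python
# Placeholder task sets keyed by score; the real Consequence / Error
# Potential tasking matrix is on the following slides, not shown here.
CONSEQUENCE_TASKS = {
    1: {"requirements analysis"},
    2: {"requirements analysis", "design analysis"},
    3: {"requirements analysis", "design analysis", "code analysis"},
}
ERROR_POTENTIAL_TASKS = {
    1: set(),
    2: {"process audit"},
    3: {"process audit", "test analysis"},
}

def tasks_for_component(consequence: int, error_potential: int) -> set[str]:
    # Union the two sets, since tasks are not exclusive to a given score
    return CONSEQUENCE_TASKS[consequence] | ERROR_POTENTIAL_TASKS[error_potential]

print(sorted(tasks_for_component(3, 2)))
```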

  49. IV&V Tasking Matrix • Items with a caret (^) next to them are only invoked when human safety is involved

  50. IV&V Tasking Matrix (2)
