
Validation Workshop


Presentation Transcript


  1. Validation Workshop. Dr. Butch Caffall, Director, NASA IV&V Facility. 26SEP07

  2. Bryan O’Connor Challenge to NASA IV&V
  NASA IV&V has a strong verification approach, but there is little in the way of validation. The challenge to you is to develop a validation approach that will complement the verification approach so that we truly have Independent Verification and Validation to support NASA programs and projects.

  3. What is IV&V?
  • Validation: Are you building the right product? That is, did you meet the operational need?
  • Verification: Are you building the product right? That is, did you meet the validated specification?
  • Independence: IEEE defines independence in IV&V in terms of three parameters:
    • Technical Independence: Achieved by IV&V personnel who use their expertise to assess development processes and products independent of the developer.
    • Managerial Independence: Requires that responsibility for the IV&V effort be vested in an organization separate from the organization responsible for performing the system implementation.
    • Financial Independence: Requires that control of the IV&V budget be vested in an organization independent of the development organization.

  4. Focus of Validation Workshop
  • Validation: Are you building the right product? That is, did you meet the operational need?
  • Verification: Are you building the product right? That is, did you meet the validated specification?
  • Independence: IEEE defines independence in IV&V in terms of three parameters:
    • Technical Independence: Achieved by IV&V personnel who use their expertise to assess development processes and products independent of the developer.
    • Managerial Independence: Requires that responsibility for the IV&V effort be vested in an organization separate from the organization responsible for performing the system implementation.
    • Financial Independence: Requires that control of the IV&V budget be vested in an organization independent of the development organization.

  5. IEEE Standard 1012-2004 for Software Verification and Validation
  • “Software V&V is an extension of program management and systems engineering …” (Introduction)
  • “Software V&V processes determine whether the development products of a given activity conform to the requirements of that activity and whether the software satisfies its intended use and user needs.” (paragraph 1)
  • “Since software directly affects system behavior and performance, the scope of V&V processes is the software system, including the operational environment, operators and users, hardware, and interfacing software.” (paragraph 1.3)
  • “V&V processes provide an objective assessment of software products and processes throughout the software lifecycle. This assessment demonstrates whether the software requirements and system requirements (i.e., those allocated to software) are correct, complete, accurate, consistent, and testable.” (paragraph 1.4)
  • “The IV&V effort should formulate its own understanding of the problem and how the proposed system is solving the problem.” (Appendix C, paragraph C.1)
  • “For software tools, technical independence means that the IV&V effort uses or develops its own set of test and analysis tools separate from the developer’s tools.” (Appendix C, paragraph C.1)

  6. Validation Question
  How do we know whether we are acquiring the right product?

  7. Validation Examples
  • Is this the right carburetor?
  • Is this the right house?
  • Is this the right tool?
  • Is this the right software application?
  Do we validate based on the carburetor specifications? House blueprints? Tool specifications? Application requirements?

  8. Validation Examples (continued)
  To answer these questions, we need to understand the operational need:
  • For the carburetor, we need to understand the operator’s need for a vehicle.
  • For the house, we need to understand the future owner’s need for a building.
  • For the tool, we need to understand the user’s task at hand.
  • For the software application, we need to understand the goals of the system.
  A product cannot be validated from a closed perspective limited to the product itself. We must understand and validate the product in terms of its intended use.

  9. Statement of the Validation Problem
  To validate some item or artifact, one must have a point of reference that represents goodness. For the validation of software, what is that point of reference?
  • System requirements? We trace software requirements back to the system requirements to assure ourselves that the software requirements have a basis for being; however, to what degree are the system requirements (those allocated to software) themselves valid?
  • Subject Matter Experts? SMEs tend to understand how a system may function; however, relying on SMEs for validation raises issues of consistency of opinion, depth of knowledge, and coverage of system behavior.
  • CONOPs? The CONOPs represents the users’/stakeholders’ interest in the system and seems to be the appropriate starting point for validation; however, by itself it is insufficient for software validation.

  10. Point of Reference for Software Validation
  • Link the software to the goals of the system, i.e., user needs
  • Develop an understanding of the software’s contributions to the system
  • Assess artifacts throughout the development lifecycle, with particular emphasis on early-lifecycle activities: architecture and requirements
  • Assess software-development products with respect to system goals
  • The point of reference is the System Reference Model (SRM)
  • The focus is on the behaviors to which the software contributes (illustrated in the sketch below):
    • What the system should do
    • What the system should not do
    • What the system should do under adverse conditions
  NASA IV&V software validation is performed from the perspective of the software’s contributions to the system behaviors that accomplish user goals.
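  The three behavior categories above lend themselves to executable checks against a simple model of system state. The sketch below is only an illustration of that idea: the power-management behavior, field names, and the 24.0 V threshold are invented for the example and are not taken from the presentation or any NASA system.

```python
# Illustrative sketch only: a hypothetical power-management behavior,
# organized by the three validation categories named on this slide.
# All names and thresholds are invented for the example.
from dataclasses import dataclass

@dataclass
class SystemState:
    battery_voltage: float   # volts
    in_safe_mode: bool
    heater_on: bool
    sensor_fault: bool       # adverse condition: a failed voltage sensor

def should_do(state: SystemState) -> bool:
    """What the system should do: enter safe mode on low battery voltage."""
    return state.in_safe_mode if state.battery_voltage < 24.0 else True

def should_not_do(state: SystemState) -> bool:
    """What the system should not do: never run the heater while in safe mode."""
    return not (state.in_safe_mode and state.heater_on)

def should_do_under_adversity(state: SystemState) -> bool:
    """Under an adverse condition (sensor fault), fail safe by entering safe mode."""
    return state.in_safe_mode if state.sensor_fault else True

# Each check can be evaluated against scenarios drawn from the CONOPs and use cases.
scenario = SystemState(battery_voltage=22.5, in_safe_mode=True,
                       heater_on=False, sensor_fault=False)
assert should_do(scenario) and should_not_do(scenario) and should_do_under_adversity(scenario)
```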

  11. NASA IV&V System Reference Model
  • Recall from the IEEE Standard for Software Verification and Validation (1012-2004):
    • “The IV&V effort should formulate its own understanding of the problem and how the proposed system is solving the problem.”
    • “For software tools, technical independence means that the IV&V effort uses or develops its own set of test and analysis tools separate from the developer’s tools.”
  • Composition of the NASA IV&V System Reference Model:
    • Goals derived from the CONOPs
    • Use Cases that outline what must happen to achieve the goals
    • Unified Modeling Language (UML) artifacts that depict the desired behaviors that the software will exhibit or contribute to the system
    • Assertions developed to add precision to software requirements (see the sketch below)
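  As a rough illustration of how an assertion can add precision to a natural-language requirement, the sketch below pairs a hypothetical pointing requirement with a mechanically checkable condition over telemetry. The requirement text, field names, and limits are invented for the example; they are not drawn from an actual System Reference Model.

```python
# Hypothetical example of pairing a natural-language requirement with a
# precise, checkable assertion, in the spirit of the SRM assertions above.
# The requirement wording, telemetry fields, and limits are all invented.

REQUIREMENT = (
    "When commanded to slew, the software shall bring the pointing error "
    "to within 0.1 degrees within 30 seconds of the command."
)

def assertion(telemetry):
    """telemetry is a time-ordered list of samples:
    {"t": seconds_since_slew_command, "pointing_error_deg": float}.
    The assertion pins down what 'within 30 seconds' means here: every sample
    at or after t = 30 s must show an error no greater than 0.1 degrees."""
    return all(s["pointing_error_deg"] <= 0.1 for s in telemetry if s["t"] >= 30.0)

# A scenario generated from a use case can then be checked mechanically.
samples = [{"t": 10.0, "pointing_error_deg": 2.00},
           {"t": 30.0, "pointing_error_deg": 0.08},
           {"t": 45.0, "pointing_error_deg": 0.05}]
assert assertion(samples)
```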

  12. Goals of Validation Workshop
  • Shared understanding of the software validation problem
  • Discussion of proposed methodologies for software validation
  • Discussion of automation that could support software validation
  • Proposed ideas on other methodologies and tools that might support software validation
