
Verification and Validation



  1. Verification and Validation • Verification: checks that the program conforms to its specification. • Are we building the product right? • Validation: checks that the program as implemented meets the expectations of the user. • Are we building the right product?

  2. Static Verification • Program inspection • Formal methods

  3. Verification and Proofs of Correctness • Formally specify the desired functionality, then verify that the program is a correct implementation of the specification.

  4. Hoare's Rules • Program fragments and assertions are composed into triples {P}S{Q} • where P is the precondition assertion, Q is the postcondition assertion, and S is a program statement (or sequence of statements). • Interpretation: if P is true before S is executed, then when S terminates, Q is satisfied.

  5. Proofs of Correctness • Partial correctness: if the precondition is true, and the program terminates, then the postcondition is satisfied. • Total correctness: is partial correctness, plus a proof of termination.

  6. The Assignment Rule {P} x := f {Q} • where P is the same as Q, except that every occurrence of x in Q has been replaced by f. • Backward substitution • {x = 5} x = x + 1 {x = 6} • {z > y + 50} x = z - 43 {x > y + 7}
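Backward substitution for the first example above can be written out step by step (a sketch in standard Hoare-logic notation):

```latex
% Assignment axiom:  \{Q[f/x]\}\; x := f \;\{Q\}
% Goal: the weakest precondition for  x := x + 1  with postcondition  x = 6.
\{(x = 6)[x+1/x]\}\; x := x + 1 \;\{x = 6\}
\;\Longrightarrow\;
\{x + 1 = 6\}\; x := x + 1 \;\{x = 6\}
\;\Longrightarrow\;
\{x = 5\}\; x := x + 1 \;\{x = 6\}
```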

  7. Rule for Sequencing Statements

       {F1} S1 {F2},   {F2} S2 {F3}
       ----------------------------
            {F1} S1; S2 {F3}

  8. Rules for Conditions and Loops

       {P & C} S1 {Q},   {P & ~C} S2 {Q}
       ----------------------------------
       {P} if C then S1 else S2 endif {Q}

              {I & C} S {I}
       ------------------------------
       {I} while C do S od {I & ~C}
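The while rule can be exercised mechanically: pick an invariant I and assert it on entry, after every iteration, and together with ~C at exit. A minimal sketch in Python (the program and invariant are illustrative, not from the slides):

```python
def sum_to(n):
    """Sum 0..n-1 with a checked loop invariant I: total == sum(range(i))."""
    total, i = 0, 0
    assert total == sum(range(i))                    # I holds on entry
    while i < n:                                     # condition C
        total += i
        i += 1
        assert total == sum(range(i))                # {I & C} S {I}
    assert total == sum(range(i)) and not (i < n)    # at exit: I & ~C
    return total

print(sum_to(10))  # 45
```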

  9. Software Testing • Software Requirements Specifications • Describes the expected runtime behaviors of the software. • A Test Plan • Describe how to test each behavior. • The software (source code or executable)

  10. Testing • Failure: • The departure of program operation from user requirements. • Fault: • A defect in a program that may cause a failure. • Error: • Human action that results in software containing a fault.

  11. The Test Plan • A "living document": it is born with the system and evolves as the system evolves. It is what the key decision makers use to evaluate the system. • User objectives. • System description and traceability matrices. • Special risk elements. • Required characteristics -- operational and technical. • Critical test issues -- operational and technical.

  12. Management Plan • Integrated Schedule • Roles and Responsibilities • Resources and Sharing

  13. Verification Outline • Verification to Date • Previous Results • Testing Planned • Unresolved Issues • Issues arising during this phase • Scope of Planned tests • Test Objectives • Special Resources • Test Articles

  14. Validation Outline • Validation to Date • Previous Results • Testing Planned • Unresolved Issues • Issues arising during this phase • Scope of Planned tests • Test Objectives • Special Resources • Test Articles

  15. Test Results and Traceability • Test Procedures • Test Reporting • Development Folders

  16. Types of Faults • algorithmic faults • computation and precision faults • documentation faults • stress or overload faults • capacity or boundary faults • timing or coordination faults • throughput or performance faults

  17. IBM Orthogonal Defect Classification • Function: fault that affects capability, end-user interfaces, product interface with hardware architecture, or global data structure. • Interface: fault in interfacing with other components or drivers via calls, macros, control blocks, or parameter lists. • Checking: fault in program logic that fails to validate data and values properly before they are used. • Assignment: fault in data structure or code block initialization. • Timing/serialization: fault that involves timing of shared and real-time resources. • Build/package/merge: fault that occurs because of problems in repositories, change management, or version control. • Documentation: fault that affects publications and maintenance notes. • Algorithm: fault involving efficiency or correctness of an algorithm or data structure, but not design.

  18. The Testing Process • Unit testing • Component testing • Integration testing • System testing • Acceptance testing

  19. Testing Strategies • Top-down testing • Bottom-up testing • Thread testing • Stress testing • Back-to-back testing

  20. Traditional Software Testing Techniques • Black box testing • program specifications: functional testing • operational profile: random testing, partition testing • White box testing • statement coverage • branch coverage • data flow coverage • path coverage • Others • Stress testing • Back-to-back testing

  21. Defect Testing • Black-box testing • Interface testing • Structural testing

  22. Black-Box Testing • Graph-based testing methods • Equivalence Partitioning • Boundary value analysis

  23. Graph-Based Testing • Transaction flow modeling • Finite state modeling • Data flow modeling • Timing modeling

  24. Partition Testing • If an input condition specifies a range, one valid and two invalid equivalence classes are defined. • If an input condition requires a specific value, one valid and two invalid equivalence classes are defined. • If an input condition specifies a member of a set, one valid and one invalid class are defined. • If an input condition is boolean, one valid and one invalid class are defined.
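As an illustration of the range guideline (the program and the 1..120 range are made up), one representative value per equivalence class is enough:

```python
def accepts_age(age):
    # hypothetical program under test: valid ages are 1..120
    return 1 <= age <= 120

# one valid and two invalid equivalence classes for the range condition,
# each represented by a single test value
valid, below, above = 30, -5, 200
print(accepts_age(valid), accepts_age(below), accepts_age(above))  # True False False
```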

  25. Boundary Value Analysis • If an input condition specifies a range bounded by values a and b, test cases should be designed with values a and b, and with values just above and just below a and b. • If an input condition specifies a number of values, test cases should be developed that exercise the minimum and maximum numbers. Values just above and just below the minimum and maximum are also tested. • Apply guidelines 1 and 2 to output conditions. • If internal program data structures have prescribed boundaries, be certain to design a test case to exercise the data structure at its boundary.
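For the same hypothetical 1..120 range, guideline 1 yields six boundary test values (integer inputs assumed):

```python
def boundary_values(a, b):
    """Test values for a range [a, b]: a and b themselves,
    plus the values just below and just above each boundary."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

print(boundary_values(1, 120))  # [0, 1, 2, 119, 120, 121]
```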

  26. Interface Testing • Parameter interfaces • Shared memory interfaces • Procedural interfaces • Message passing interfaces

  27. Integration Testing • top-down integration • bottom-up integration • incremental testing

  28. White-Box Testing • Statement Coverage Criterion • Branch coverage criterion • Data flow coverage criterion • Path coverage criterion

  29. Statement Coverage

     while (...) {
         ...
         while (...) {
             ...
             break;
         }
         ...
         break;
     }

     (Every statement executes on a single pass -- the breaks prevent either loop from iterating twice -- so 100% statement coverage never exercises loop repetition.)

  30. Branch Coverage

     if (StdRec != null)
         StdRec.name = arg[1];
     ...
     write(StdRec.name);

     (A test with StdRec != null executes every statement, but only the false branch reveals that write(StdRec.name) dereferences a null StdRec.)
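The same trap can be demonstrated concretely. A Python analogue of the fragment (names are hypothetical): one test reaches full statement coverage, and only the untaken false branch exposes the fault.

```python
class Rec:
    name = ""

def show_name(std_rec, args):
    # fault: when std_rec is None, the final line raises AttributeError,
    # yet any run with std_rec != None already covers every statement
    if std_rec is not None:
        std_rec.name = args[1]
    return std_rec.name

print(show_name(Rec(), ["prog", "Ada"]))   # Ada  (100% statement coverage)
try:
    show_name(None, ["prog", "Ada"])       # the test branch coverage demands
except AttributeError:
    print("fault exposed on the false branch")
```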

  31. Data Flow • Data-flow graph: defined on the control-flow graph by giving, for each node n, sets of variables DEF(n), C-USE(n), and P-USE(n). • Variable x is in DEF(n) if • n is the start node and x is a global, parameter, or static local variable, or • x is declared in basic block n with an initializer, or • x is assigned to in basic block n with the =, op=, ++, or -- operator.

  32. Data Flow Testing

     read(x, y);
     if (x > 0) z = 1; else z = 0;
     if (y < 0) write(z); else write(y / z);

     Number of test requirements, where n is the number of conditional statements:
     Path coverage: 2^n. Data-flow coverage: n^4. Branch coverage: cyclomatic complexity (McCabe), CC(G) = #E - #N + 2P.

  33. C-Use • Variable x is in C-USE(n) if x occurs as a C-USE expression in basic block n, i.e. as • a procedure argument, • an initializer in a declaration, • a return value in a return statement, • the second operand of =, • either operand of op=, • the operand of ++, --, or *, • the first operand of . or ->.

  34. P-Use • Variable x is in P-USE(n) if x occurs as a P-USE expression in basic block n • as the conditional expression in an if, for, while, do, or switch statement, or • as the first operand of the conditional expression operator (?:), the logical and operator (&&), or the logical or operator (||).
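Applying these definitions to the two-condition example used on the data-flow slides (the node numbering is my own, for illustration):

```python
# DEF / C-USE / P-USE sets for:
#   n1: read(x, y)       n2: if (x > 0)    n3: z = 1     n4: z = 0
#   n5: if (y < 0)       n6: write(z)      n7: write(y / z)
DEF   = {"n1": {"x", "y"}, "n3": {"z"}, "n4": {"z"}}
C_USE = {"n6": {"z"}, "n7": {"y", "z"}}   # values used in computations/output
P_USE = {"n2": {"x"}, "n5": {"y"}}        # values used in predicates

print(sorted(P_USE["n2"]), sorted(C_USE["n7"]))  # ['x'] ['y', 'z']
```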

  35. C-Use Coverage • A C-Use is a variable x and the set of all paths in the data-flow graph from node na to nb such that • x is in DEF(na), and • x is not in DEF(ni) for any other node ni on the paths (definition clear path), and • x is in C-USE(nb). • A C-Use is covered by a set of tests if at least one of the paths in the C-Use is executed when the test is run.

  36. P-Use Coverage • A P-Use is a variable x and the set of all paths in the data-flow graph from node na to nb such that • x is in DEF(na), and • x is not in DEF(ni) for any other node ni on the paths (definition clear path), and • x is in P-USE(nb). • A P-Use is covered by a set of tests if at least one of the paths in the P-Use is executed when the test is run.

  37. Path Coverage

     read(x, y);
     if (x > 0) z = 1; else z = 0;
     if (y < 0) write(z); else write(y / z);

     With the two conditions x > 0 and y < 0, three test cases exercise distinct paths:
     x = 1, y = 1   -> 1
     x = -1, y = -1 -> 0
     x = 0, y = 0   -> error (y / z divides by z = 0)

  38. Subsumption hierarchy of coverage criteria (an arrow means "subsumes"):

     all-paths -> all-du-paths -> all-uses
     all-uses -> all-c-uses/some-p-uses -> all-c-uses, all-defs
     all-uses -> all-p-uses/some-c-uses -> all-defs
     all-p-uses/some-c-uses -> all-p-uses -> branch -> statement

  39. Complexity of White-Box Testing • Branch coverage: McCabe's cyclomatic complexity, CC(G) = #E - #V + 2P (for G = (V, E)) • All-defs: M + I * V • All-p-uses, all-c-uses/some-p-uses, all-p-uses/some-c-uses, all-uses: N^2 • All-du-paths, all-paths: 2^N, where N is the number of conditional statements
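McCabe's formula is easy to check on a small control-flow graph. A sketch for the two-if example from the earlier slide (the node/edge encoding is my reconstruction):

```python
def cyclomatic_complexity(edges, num_nodes, p=1):
    """CC(G) = #E - #V + 2P for a control-flow graph with p connected components."""
    return len(edges) - num_nodes + 2 * p

# CFG of: read(x,y); if (x>0) z=1; else z=0; if (y<0) write(z); else write(y/z);
# nodes: 1 read, 2 x>0, 3 z=1, 4 z=0, 5 y<0, 6 write(z), 7 write(y/z), 8 exit
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (4, 5), (5, 6), (5, 7), (6, 8), (7, 8)]
print(cyclomatic_complexity(edges, 8))  # 3  (= number of decisions + 1)
```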

  40. Mutation Testing • A program under test is seeded with a single error to produce a "mutant" program. • A test covers (kills) a mutant if the output of the mutant and the program under test differ for that test input. • The mutation coverage measure for a test set is the ratio of mutants covered to total mutants.
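The scheme can be sketched in a few lines (a toy example with one hand-written mutant; real tools generate mutants automatically):

```python
def max2(a, b):                 # program under test
    return a if a >= b else b

def mutant(a, b):               # seeded single error: ">=" mutated to "<="
    return a if a <= b else b

tests = [(1, 2), (2, 2), (3, 1)]
# a test covers the mutant if the two programs' outputs differ on it
covering = [t for t in tests if max2(*t) != mutant(*t)]
mutants_total, mutants_covered = 1, (1 if covering else 0)
print(covering)                         # [(1, 2), (3, 1)]  -- (2, 2) kills nothing
print(mutants_covered / mutants_total)  # 1.0
```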

  41. Debugging: Program Slicing Approach • Static slicing: decomposes a program by statically analyzing its data flow and control flow. • A static program slice for a given variable at a given statement contains all the executable statements that could influence the value of that variable at the given statement. • The exact execution path for a given input is a subset of the static program slice with respect to the output variables at the given checkpoint. • Focus is an automatic debugging tool based on static program slicing to locate bugs.

  42. Dynamic Slicing • Dynamic data slice: a dynamic data slice with respect to a given expression, location, and test case is the set of all assignments whose computations have propagated into the current value of the given expression at the given location. • Dynamic control slice: a dynamic control slice with respect to a given location and test case is the set of all predicates that enclose the given location.
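A dynamic data slice can be computed from an execution trace by taking the backward closure of the assignments that reach the value of interest. A toy sketch (the trace format and example program are invented for illustration):

```python
def dynamic_data_slice(trace, var):
    """trace: executed assignments in order, as (line, target, used_vars).
    Returns the lines whose computations propagate into var's final value."""
    slice_lines, needed = set(), {var}
    for line, target, used in reversed(trace):
        if target in needed:
            slice_lines.add(line)
            needed.discard(target)     # this definition satisfies the need ...
            needed |= set(used)        # ... and creates needs for its operands
    return sorted(slice_lines)

# execution of: 1: a=1; 2: b=2; 3: c=a+1; 4: d=b*2; 5: out=c
trace = [(1, "a", []), (2, "b", []), (3, "c", ["a"]), (4, "d", ["b"]), (5, "out", ["c"])]
print(dynamic_data_slice(trace, "out"))  # [1, 3, 5]  -- b and d never reach out
```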

  43. Object-Oriented Testing Process • Unit testing - class • Integration testing - cluster • System testing - program

  44. Class Testing • Inheritance • Polymorphism • Sequence

  45. Class Testing Strategies • Testing inheritance • Testing polymorphism • State-oriented testing • Data Flow testing • Function dependence class testing

  46. Inheritance

     A subclass may re-define its inherited functions, and other functions may be affected by the re-defined functions. When this subclass is tested, which functions need to be re-tested?

     class foo {
         int local_var;
         ...
         int f1() { return 1; }
         int f2() { return 1 / f1(); }
     };

     class foo_child : public foo {   // child class of foo
         int f1() { return 0; }       // re-defined: f2() would now divide by zero
     };

  47. Testing Inheritance "Incremental testing of object-oriented class structures," Harrold et al. (1992) • New methods: complete testing • Recursive methods: limited testing • Redefined methods: reuse test scripts

  48. Polymorphism

     An object may be bound to different classes at run time. Is it necessary to test all the possible bindings?

     // beginning of function foo
     {
         ...
         P1 p;
         P2 c;
         ...
         return (c.f1() / p.f1());
     }
     // end of function foo

  49. Testing Polymorphism • "Testing the Polymorphic Interactions between Classes," McDaniel and McGregor (1994), Clemson University

  50. State-Oriented Testing • "The state-based testing of object-oriented programs," 1992, C. D. Turner and D. J. Robson • "On Object State Testing," 1993, Kung et al. • "The testgraph methodology: Automated testing of collection classes," 1995, Hoffman and Strooper • The FREE approach: Binder, http://www.rbsc.com/pages/Free.html
