
Software Verification


Presentation Transcript


  1. Software Verification: Introduction & The Model-Driven Test Design Process

  2. Testing in the 21st Century • Today’s software market is much bigger, is more competitive, and has more users • Embedded control applications: airplanes, air traffic control, spaceships, watches, ovens, remote controllers, PDAs, memory seats, DVD players, garage door openers, cell phones • These increase pressure on testers • Programmers must unit test – with no training, education or tools! • Tests are key to functional requirements – but who builds those tests?

  3. Cost of Testing • You’re going to spend at least half of your development budget on testing, whether you want to or not • In the real world, testing is the principal post-design activity • Restricting early testing usually increases cost • Extensive hardware-software integration requires more testing

  4. Why Test? • If you don’t start planning for each test when the functional requirements are formed, you’ll never know why you’re conducting the test • What fact is each test trying to verify? • Program managers often say, “Testing is too expensive.” Not testing is even more expensive

  5. Testing (diagram): the implementation is instrumented so that running the tests generates events (e.g., send(42)), and event evaluation checks the generated events against the specification.

  6. Test Design in Context • Test design is the process of designing input values that will effectively test software • Test design is one of several activities for testing software • It is the most mathematical and most technically challenging of these activities

  7. The Sieve program – is this program correct?

      int P[101]; int V = 1; int N = 1; int I;
      main() {
        while (N < 101) {
          for (I = 1; I <= N; I++)
            if (P[I] = 0) { P[I] = V++; N++; }
            else if (V % P[I] == 0) I = N + 1;
            else I++;
        }
      }

  8. Types of Test Activities • Testing can be broken up into four general types of activities: 1. Test Design – (a) criteria-based or (b) human-based, 2. Test Automation, 3. Test Execution, 4. Test Evaluation • Each type of activity requires different skills, background knowledge, education and training

  9. 1. Test Design – (a) Criteria-Based: design test values to satisfy coverage criteria or another engineering goal • This is the most technical job in software testing • Test design is analogous to software architecture on the development side

  10. 1. Test Design – (b) Human-Based: design test values based on domain knowledge of the program and human knowledge of testing • Requires knowledge of the domain, testing, and user interfaces • Requires almost no traditional CS • A background in the domain of the software is essential • An empirical background is very helpful (biology, psychology, …) • A logic background is very helpful (law, philosophy, math, …)

  11. 2. Test Automation: embed test values into executable scripts • This is slightly less technical • Requires knowledge of programming • Fairly straightforward programming – small pieces and simple algorithms • Programming is out of reach for many domain experts

  12. What is JUnit? • Open source Java testing framework used to write and run repeatable automated tests (junit.org) • JUnit features include: assertions for testing expected results, test fixtures for sharing common test data, test suites for easily organizing and running tests, and graphical and textual test runners • JUnit is widely used in industry • JUnit tests can be run as standalone Java programs (from the command line) or within an IDE such as Eclipse
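
For illustration, a minimal JUnit 4-style test might look like the sketch below; the tiny Calculator class is a stand-in for real code under test (it is not part of JUnit or of these slides).

      import org.junit.Test;
      import static org.junit.Assert.assertEquals;

      public class CalculatorTest {

          // Stand-in class under test (would normally live in its own file)
          static class Calculator {
              int add(int a, int b) { return a + b; }
          }

          @Test
          public void addReturnsSumOfOperands() {
              Calculator calc = new Calculator();
              assertEquals(5, calc.add(2, 3));   // assertion checks the expected result
          }
      }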

  13. 3. Test Execution: run tests on the software and record the results • This is easy – and trivial if the tests are well automated • 4. Test Evaluation: evaluate the results of testing and report to developers • This is much harder than it may seem

  14. Types of Test Activities – Summary • These four general test activities are quite different • It is a poor use of resources to use people inappropriately

  15. Model-Driven Test Design (diagram): the design abstraction level contains the model/structure, test requirements, and refined requirements / test specs derived from the software artifact; the implementation abstraction level contains the input values, test cases, test scripts, and pass/fail test results.

  16. Model-Driven Test Design – Steps (diagram): analysis of the software artifact produces a model/structure; a coverage criterion generates test requirements, which are refined (with domain analysis) into refined requirements / test specs; input values are generated, extended with prefix, postfix, and expected values into test cases, automated into test scripts, executed, and evaluated into pass/fail test results, with feedback into earlier steps.

  17. Model-Driven Test Design – Activities (diagram): Test Design covers the design abstraction level (model/structure, test requirements, refined requirements / test specs); Test Automation turns input values into test cases and test scripts; Test Execution runs the test scripts against the software artifact to produce test results; Test Evaluation turns test results into pass/fail.

  18. How to cover the executions?

      IF (A>1)&(B=0) THEN X = X/A; END;
      IF (A=2)|(X>1) THEN X = X+1; END;

  • Choose values for A, B, X • The value of X may change, depending on A and B • What do we want to cover? Paths? Statements? Conditions?

  19. Execute every statement at least once • By choosing A=2, B=0, X=3, each statement will be executed • The cases where the IF tests are false are not checked!

      IF (A>1)&(B=0) THEN X = X/A; END;
      IF (A=2)|(X>1) THEN X = X+1; END;
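
A Java rendering of this fragment (a sketch only; the IF/THEN pseudocode above is translated directly) shows how A = 2, B = 0, X = 3 drives execution through every statement:

      public class CoverageDemo {
          static int compute(int a, int b, int x) {
              if (a > 1 && b == 0)
                  x = x / a;        // reached when a > 1 and b == 0
              if (a == 2 || x > 1)
                  x = x + 1;        // reached when a == 2 or x > 1
              return x;
          }

          public static void main(String[] args) {
              // A = 2, B = 0, X = 3: x becomes 3 / 2 = 1, then 1 + 1 = 2,
              // so every statement is executed at least once
              System.out.println(compute(2, 0, 3));  // prints 2
          }
      }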

  20. Important Terms

  21. Validation & Verification • Validation : The process of evaluating software at the end of software development to ensure compliance with intended usage • Verification : The process of determining whether the products of a given phase of the software development process fulfill the requirements established during the previous phase

  22. According to Boehm • Verification means “we are building the product right.” • Validation means “we are building the right product.”

  23. Verification or Validation? Example: elevator response • Unverifiable (but validatable) spec: ... if a user presses a request button at floor i, an available elevator must arrive at floor i soon ... • Verifiable spec: ... if a user presses a request button at floor i, an available elevator must arrive at floor i within 30 seconds ...

  24. Software Faults, Errors & Failures • Software Fault : A static defect in the software • Software Failure : External, incorrect behavior with respect to the requirements or other description of the expected behavior • Software Error : An incorrect internal state that is the manifestation of some fault Faults in software are equivalent to design mistakes in hardware. They were there at the beginning and do not “appear” when a part wears out.
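
A hypothetical example (not from the slides) that separates the three terms: the method below contains a fault; for some inputs the fault corrupts the internal count (an error), and only some of those errors become a visible failure.

      public class CountZeroes {
          // Hypothetical method: count how many zeroes an array contains
          public static int numZero(int[] a) {
              int count = 0;
              // FAULT: the loop should start at i = 0; starting at 1 skips the first element
              for (int i = 1; i < a.length; i++)
                  if (a[i] == 0) count++;
              return count;
          }

          public static void main(String[] args) {
              // Skipped element is non-zero: the output is still correct, so no failure is observed
              System.out.println(numZero(new int[]{2, 7, 0}));  // prints 1 (correct)
              // Skipped element is zero: the wrong internal count (an error state) becomes a failure
              System.out.println(numZero(new int[]{0, 7, 2}));  // prints 0, but 1 was expected
          }
      }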

  25. Testing & Debugging • Testing : Finding inputs that cause the software to fail • Debugging : The process of finding a fault given a failure

  26. Static and Dynamic Testing • Static Testing: testing without executing the program • This includes software inspections and some forms of analysis • Very effective at finding certain kinds of problems – especially “potential” faults, that is, problems that could lead to faults when the program is modified • Dynamic Testing: testing by executing the program with real inputs

  27. White-box and Black-box Testing • Black-box testing: deriving tests from external descriptions of the software, including specifications, requirements, and design • White-box testing: deriving tests from the source code internals of the software, specifically including branches, individual conditions, and statements • Model-based testing: deriving tests from a model of the software (such as a UML diagram)

  28. Stress Testing: tests that are at the limit of the software’s expected input domain • Very large (or very small) numeric values • Very long strings, empty strings • Null references • Very large files • Many users making requests at the same time • Invalid values
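
An illustrative JUnit 4 sketch of stress tests along these lines; the small reverse() helper is a stand-in for real code under test, not something taken from the slides.

      import org.junit.Test;
      import static org.junit.Assert.assertEquals;

      public class ReverseStressTest {

          static String reverse(String s) {               // stand-in method under test
              return new StringBuilder(s).reverse().toString();
          }

          @Test
          public void veryLongString() {                  // very long input
              StringBuilder sb = new StringBuilder();
              for (int i = 0; i < 1_000_000; i++) sb.append('a');
              String input = sb.toString();
              assertEquals(input, reverse(input));        // a string of 'a's is its own reverse
          }

          @Test
          public void emptyString() {                     // empty-string boundary
              assertEquals("", reverse(""));
          }

          @Test(expected = NullPointerException.class)
          public void nullReference() {                   // null reference input
              reverse(null);
          }
      }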

  29. Top-Down and Bottom-Up Testing • Top-Down Testing: test the main procedure, then go down through the procedures it calls, and so on • Bottom-Up Testing: test the leaves in the call tree (procedures that make no calls) and move up to the root; each procedure is not tested until all of its children have been tested

  30. Test Case • Test Case Values: the values that directly satisfy one test requirement • Expected Results: the result that will be produced when executing the test if the program satisfies its intended behavior

  31. Search routine specification

      procedure Search (Key : INTEGER; T : array 1..N of INTEGER; Found : BOOLEAN; L : 1..N);
      Pre-condition:
        -- the array has at least one element
        1 <= N
      Post-condition:
        -- the element is found and is referenced by L
        (Found and T(L) = Key)
        or
        -- the element is not in the array
        (not Found and not (exists i, 1 <= i <= N, T(i) = Key))
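
A minimal Java sketch consistent with this specification (an adaptation: a returned index of -1 replaces the Found flag, and a plain linear search is assumed):

      public class Search {
          public static int search(int key, int[] t) {
              // Pre-condition: t has at least one element (t.length >= 1)
              for (int i = 0; i < t.length; i++)
                  if (t[i] == key)
                      return i;      // found: t[returned index] == key
              return -1;             // not found: key occurs nowhere in t
          }

          public static void main(String[] args) {
              int[] t = {17, 29, 21, 23};
              System.out.println(search(21, t));  // 2  (element found)
              System.out.println(search(25, t));  // -1 (element not in the array)
          }
      }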

  32. Search routine test cases

  33. Testing levels

  34. Testing Levels Based on Test Process Maturity • Level 0 : There’s no difference between testing and debugging • Level 1 : The purpose of testing is to show correctness • Level 2 : The purpose of testing is to show that the software doesn’t work • Level 3 : The purpose of testing is not to prove anything specific, but to reduce the risk of using the software • Level 4 : Testing is a mental discipline that helps all IT professionals develop higher quality software

  35. Level 0 Thinking • Testing is the same as debugging • Does not distinguish between incorrect behavior and mistakes in the program • Does not help develop software that is reliable or safe

  36. Level 1 Thinking • Purpose is to show correctness • What do we know if no failures? • Good software or bad tests? • Test engineers have no: • Strict goal • Real stopping rule • Formal test technique • Test managers are powerless

  37. Level 2 Thinking • Purpose is to show failures • Looking for failures is a negative activity This describes most software companies.

  38. Level 3 Thinking • Testing can only show the presence of failures • Whenever we use software, we incur some risk • Risk may be small and consequences unimportant • Risk may be great and the consequences catastrophic • Testers and developers work together to reduce risk

  39. Level 4 Thinking • Testing is only one way to increase quality • Test engineers can become technical leaders of the project

  40. Testing Models

  41. Testing at Different Levels • Acceptance testing: is the software acceptable to the user? • System testing: test the overall functionality of the system • Integration testing: test how modules interact with each other • Module testing: test each class, file, module or component • Unit testing: test each unit (method) individually • (Diagram: main uses class P; classes A and B provide methods mA1(), mA2(), mB1(), mB2().) • This view obscures underlying similarities

  42. Criteria Based on Structures • Structures: four ways to model software • 1. Graphs • 2. Logical Expressions – e.g., (not X or not Y) and A and B • 3. Input Domain Characterization – e.g., A: {0, 1, >1}, B: {600, 700, 800}, C: {swe, cs, isa, infs} • 4. Syntactic Structures – e.g., if (x > y) z = x - y; else z = 2 * x;

  43. 1. Graph Coverage – Structural (graph with nodes 1–7) • Node (statement) coverage – cover every node: 1 2 5 6 7 and 1 3 4 3 5 6 7 • Edge (branch) coverage – cover every edge: 1 2 5 6 7, 1 3 4 3 5 6 7, and 1 3 5 7 • Path coverage – cover every path: 1 2 5 6 7, 1 2 5 7, 1 3 5 6 7, 1 3 5 7, 1 3 4 3 5 6 7, 1 3 4 3 5 7, … • This graph may represent: statements & branches, methods & calls, components & signals, states and transitions

  44. Example Software Artifact: Java Method

      /**
       * Return the index of node n at the first position it appears,
       * or -1 if it is not present.
       */
      public int indexOf(Node n) {
          for (int i = 0; i < path.size(); i++)
              if (path.get(i).equals(n))
                  return i;
          return -1;
      }

  (Control flow graph: nodes for i = 0, the loop test i < path.size(), the if test, return i, and return -1.)
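
For illustration, a standalone adaptation of the method (Node replaced by String, the path field passed as a parameter) together with two calls that cover both return nodes of the control flow graph:

      import java.util.Arrays;
      import java.util.List;

      public class IndexOfDemo {
          static int indexOf(List<String> path, String n) {
              for (int i = 0; i < path.size(); i++)
                  if (path.get(i).equals(n))
                      return i;        // node: element found
              return -1;               // node: element not present
          }

          public static void main(String[] args) {
              List<String> path = Arrays.asList("a", "b", "c");
              System.out.println(indexOf(path, "b"));  // 1  -> covers the "return i" node
              System.out.println(indexOf(path, "z"));  // -1 -> covers the "return -1" node
          }
      }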

  45. 1. Graph – FSM Example: Memory Seats in a Lexus ES 300 (state diagram). States include Driver 1 Configuration, Driver 2 Configuration, New Configuration Driver 1, New Configuration Driver 2, and Modified Configuration; transitions are labeled with a guard (safety constraint) and a trigger (input), e.g. [Ignition = off] | Button1, [Ignition = off] | Button2, [Ignition = on] | seatBack(), seatBottom(), lumbar(), or sideMirrors() (to Modified), [Ignition = on] | Reset AND Button1, [Ignition = on] | Reset AND Button2, and Ignition = off.

  46. 2. Logical Expressions • Example: ((a > b) or G) and (x < y) • Logical expressions come from program decision statements, software specifications, and transitions (e.g., in finite state machines)

  47. 2. Logic – Active Clause Coverage • Predicate: ((a > b) or G) and (x < y)

         (a > b)   G   (x < y)
      1     T      F      T
      2     F      F      T     rows 1–2: with these values for G and (x < y), (a > b) determines the value of the predicate
      3     F      T      T
      4     F      F      T     rows 3–4: (a > b) and (x < y) are fixed so that G determines the value of the predicate
      5     T      T      T
      6     T      T      F     rows 5–6: (a > b) and G are fixed so that (x < y) determines the value of the predicate
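
A small Java sketch (illustrative only) that evaluates this predicate for the paired rows, showing that flipping just the active clause flips the predicate’s value:

      public class ActiveClauseDemo {
          // Evaluate ((a > b) or G) and (x < y) as a function of its three clauses
          static boolean predicate(boolean aGtB, boolean g, boolean xLtY) {
              return (aGtB || g) && xLtY;
          }

          public static void main(String[] args) {
              // Rows 1 and 2: G = false, (x < y) = true; only (a > b) changes
              System.out.println(predicate(true,  false, true));   // true
              System.out.println(predicate(false, false, true));   // false
              // Rows 5 and 6: (a > b) = true, G = true; only (x < y) changes
              System.out.println(predicate(true, true, true));     // true
              System.out.println(predicate(true, true, false));    // false
          }
      }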

  48. 3. Input Domain Characterization • Describe the input domain of the software • Identify inputs, parameters, or other categorization • Partition each input into finite sets of representative values • Choose combinations of values • System level • Number of students { 0, 1, >1 } • Level of course { 600, 700, 800 } • Major { swe, cs, isa, infs } • Unit level • Parameters F (int X, int Y) • Possible values X: { <0, 0, 1, 2, >2 }, Y : { 10, 20, 30 } • Tests • F (-5, 10), F (0, 20), F (1, 30), F (2, 10), F (5, 20)
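
An illustrative sketch (the function F itself is hypothetical) that combines one representative value per block so that every block appears in at least one test, reproducing the test list above:

      public class EachChoiceDemo {
          public static void main(String[] args) {
              int[] xValues = {-5, 0, 1, 2, 5};   // one representative per block of X: <0, 0, 1, 2, >2
              int[] yValues = {10, 20, 30};        // one representative per block of Y
              // Combine values by cycling through the shorter list,
              // so every block of every parameter is used at least once
              for (int i = 0; i < xValues.length; i++) {
                  int x = xValues[i];
                  int y = yValues[i % yValues.length];
                  System.out.printf("F(%d, %d)%n", x, y);  // F(-5,10), F(0,20), F(1,30), F(2,10), F(5,20)
              }
          }
      }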

  49. 4. Syntactic Structures • Based on a grammar, or other syntactic definition • Primary example is mutation testing • Induce small changes to the program: mutants • Find tests that cause the mutant programs to fail: killing mutants • Failure is defined as different output from the original program • Check the output of useful tests on the original program • Example program and mutants:

      Original program:            Mutated lines:
      if (x > y)                   if (x >= y)
          z = x - y;                   z = x + y;     z = x - m;
      else
          z = 2 * x;
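
An illustrative Java version of this example: the original method, one of the mutants (> changed to >=), and a test input that kills the mutant because the two versions produce different output.

      public class MutationDemo {
          static int original(int x, int y) {
              if (x > y) return x - y;
              else       return 2 * x;
          }

          static int mutant(int x, int y) {
              if (x >= y) return x - y;   // mutant: > replaced by >=
              else        return 2 * x;
          }

          public static void main(String[] args) {
              // x == y distinguishes the versions: original returns 2*x, mutant returns 0
              int x = 3, y = 3;
              System.out.println(original(x, y));  // 6
              System.out.println(mutant(x, y));    // 0 -> different output, so this test kills the mutant
          }
      }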

  50. Coverage Overview (diagram): four ways of modeling software – Graphs, Logic, Input Space, and Syntax – each applied to different artifacts such as source code, design, specs, use cases, FSMs, DNF, models, integration, and input.
