Lesson 09 Software Verification, Validation and Testing

Presentation Transcript

  1. Lesson 09: Software Verification, Validation and Testing • Includes: • Intro to Testing • Software Testing Techniques. Includes materials adapted from Pressman: Software Engineering: A Practitioner’s Approach, Fifth Edition, McGraw-Hill, 2000. Vi and Ira Glickstein

  2. Software Testing Testing is the process of exercising a program with the specific intent of finding errors prior to delivery to the end user. From Pressman: Software Engineering: A Practitioner’s Approach, Fifth Edition, McGraw-Hill, 2000 Vi and Ira Glickstein

  3. We Design Test Cases to... • have a high likelihood of finding errors • exercise the internal logic of software components • exercise the input and output to uncover errors in program function, behavior, and performance. The goal is to find the maximum number of errors with the minimum amount of effort and time!

  4. Testing is a “Destructive” Activity • designing and executing test cases to “break” or “demolish” the software • you must change your mindset during this activity. The objective is to find errors; therefore errors found are good, not bad. Tell that to a manager!

  5. Testing Objectives • Execute a program with the intent of finding an error • A good test case has a high probability of finding an as-yet undiscovered error • A successful test case finds an as-yet undiscovered error. Successful testing uncovers errors.

  6. Testing demonstrates ... • Software functions work as specified • Behavioral and performance requirements appear to be met • Data collected is an indicator of reliability and quality. TESTING CANNOT SHOW THE ABSENCE OF ERRORS AND DEFECTS. Testing can only show that errors and defects are present.

  7. Basic Principles of Testing • All testing should be traceable to requirements • Plan testing long before testing begins. Plan and design tests during design, before any code has been generated. • Pareto Principle - 80% of errors are found in 20% of components • Start small and progress to large. First test individual components (unit test), then clusters of integrated components (integration test), then the whole system • Exhaustive testing is not possible, but we can ensure that all conditions have been exercised • Not all testing should be done by the developer - an independent third party is needed

  8. Testability • Operability—it operates cleanly • Observability—the results of each test case are readily observed • Controllability—the degree to which testing can be automated and optimized • Decomposability—control the scope of testing • Simplicity—reduce complex architecture and logic to simplify tests • Stability—few changes are requested during testing • Understandability—of the design and documents. Testability refers to how easily the product can be tested. Design software with “testability” in mind.

  9. What Testing Shows • Errors • Requirements conformance • Performance • An indication of quality

  10. Who Tests the Software? Developer: understands the system but will test “gently” and is driven by “delivery”. Independent Tester: must learn about the system but will attempt to break it and is driven by quality.

  11. Software Testing • Black Box Testing Methods • White Box Testing Methods • Strategies for Testing

  12. Black Box Testing • Based on specified function, i.e. on the requirements • Tests are conducted at the software interface • Demonstrates that the software functions are operational, input is properly accepted, output is correctly produced, and the integrity of external information is maintained • Uses the SRS as the basis for constructing tests • Usually performed by an independent group

  13. White-Box Testing … Our goal is to ensure that all statements and conditions have been executed at least once.

  14. White Box Testing -- I • Based on the internal workings of a product; requires close examination of the software • Logical paths are tested by providing test cases that exercise specific sets of conditions and/or loops • Check the status of the program by comparing actual results to expected results at selected points in the software. Exhaustive path testing is impossible.

  15. Exhaustive Testing (loop < 20 times) There are 10^14 possible paths! If we execute one test per millisecond, it would take 3,170 years to test this program!!
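The arithmetic behind that figure can be checked directly; a quick sketch in Python (the 10^14 path count comes from the slide):

```python
MS_PER_YEAR = 1000 * 60 * 60 * 24 * 365  # milliseconds in a non-leap year

paths = 10 ** 14   # possible paths through the looping program (from the slide)
ms_per_test = 1    # one test executed per millisecond

years = paths * ms_per_test / MS_PER_YEAR  # roughly 3,170 years
```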

  16. White Box Testing -- II • Logic errors and incorrect assumptions usually occur in special-case processing • Our assumptions about the flow of control and data may lead to errors that are only uncovered during path testing • We make typing errors; some are uncovered by the compiler (syntax, type checking) BUT others are only uncovered by testing. A typo may be on an obscure path. Black box testing can miss these types of errors.

  17. Selective Testing Selected path (loop < 20 times)

  18. Software Testing Techniques Testing Analysis

  19. Test Case Design Uncover errors in a complete manner with a minimum of effort and time!

  20. Basis Path Testing -- I • A white box testing technique - McCabe • Use this technique to derive a logical measure of complexity • Use it as a guide for defining a “basis set” of execution paths • Test cases derived to execute the basis set are guaranteed to execute every statement at least once during testing

  21. Cyclomatic Complexity • A quantitative measure of the logical complexity of a program • Used in conjunction with basis path testing, it defines the number of independent paths in the basis set • It provides an upper bound on the number of tests needed to ensure all statements have been executed at least once • See http://www.mccabe.com/pdf/nist235r.pdf for a more detailed paper on McCabe’s Cyclomatic Complexity.

  22. Basis Path Testing -- II First, we compute the cyclomatic complexity: V(G) = number of simple decisions + 1 (here, 3 decisions + 1), or V(G) = number of enclosed areas + 1 (here, 3 enclosed areas A, B, C + 1). In this case, V(G) = 4.

  23. Cyclomatic Complexity A number of industry studies have indicated that the higher the V(G), the higher the probability of errors. [Chart: number of modules plotted against V(G); modules with high V(G) are more error prone.]

  24. Basis Path Testing -- III Next, we derive the independent paths through nodes 1-8. Since V(G) = 4, there are four paths: Path 1: 1,2,3,6,7,8 Path 2: 1,2,3,5,7,8 Path 3: 1,2,4,7,8 Path 4: 1,2,4,7,2,...,7,8. Note: the ... implies insertion of path 1, 2, or 3 here. Finally, we derive test cases to exercise these paths.
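As an illustration of that final step, here is a hypothetical function with three simple decisions (so V(G) = 3 + 1 = 4) and one test case per basis path; together the four cases execute every statement at least once. The function and its names are invented for this sketch, not taken from the slides:

```python
def classify(x, y):
    # Three simple decisions, so V(G) = 3 + 1 = 4.
    if x < 0:
        return "negative"
    if y == 0:
        return "zero divisor"
    if x > y:
        return "ratio > 1"
    return "ratio <= 1"

# One test case per basis path:
assert classify(-1, 5) == "negative"     # first decision True
assert classify(1, 0) == "zero divisor"  # first False, second True
assert classify(10, 2) == "ratio > 1"    # first two False, third True
assert classify(1, 2) == "ratio <= 1"    # all decisions False
```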

  25. Creating Flow Graphs • A circle (node) represents one or more statements • Arrows (edges) represent flow of control and must terminate in a node • A region is an area bounded by edges and nodes; the area outside the flow graph is included as a region • Exercise - translate the flowchart on the previous slide into a flow graph

  26. Calculating Cyclomatic Complexity from a Flow Graph • Count the number of regions • V(G) = E - N + 2, where E = number of edges and N = number of nodes • V(G) = P + 1, where P = number of predicate nodes (nodes with 2 or more outgoing edges)
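Both formulas can be checked on the flow graph from the Basis Path Testing slides (nodes 1-8, with the edge list reconstructed from the four paths listed there); a minimal sketch:

```python
# Edges reconstructed from the four paths on the Basis Path Testing -- III slide.
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (3, 6),
         (5, 7), (6, 7), (4, 7), (7, 2), (7, 8)]
nodes = {n for edge in edges for n in edge}

E, N = len(edges), len(nodes)
v_edges = E - N + 2                      # V(G) = E - N + 2

out_degree = {}
for src, _dst in edges:
    out_degree[src] = out_degree.get(src, 0) + 1
P = sum(1 for d in out_degree.values() if d >= 2)  # predicate nodes
v_pred = P + 1                           # V(G) = P + 1

assert v_edges == v_pred == 4            # both formulas agree with the slide
```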

  27. Basis Path Testing Notes • You don’t need a flow chart or graph, but the picture helps when you trace program paths • Count each simple logical test as 1; compound tests count as 2 or more (depending on the number of tests) • Basis Path Testing should be applied to critical modules • Some development environments will automate the calculation of V(G)

  28. Deriving Test Cases • Using the design or code as a foundation, draw a corresponding flow graph • Determine the cyclomatic complexity • Identify the basis set of linearly independent paths • Prepare test cases that will force execution of each path in the basis set • Exercise - create a flow graph from the example

  29. Graph Matrices • Software tools exist that use a graph matrix to derive the flow graph and determine the set of basis paths • A square matrix whose size equals the number of nodes in the flow graph • Each node is identified by a number and each edge by a letter • Link weights can be added for other, more interesting properties (e.g. processing time, memory required, etc.)
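A graph matrix for the same eight-node flow graph can be sketched as a nested list; rows with two or more connections mark the predicate nodes. This is an illustrative sketch, not a specific tool’s format:

```python
nodes = [1, 2, 3, 4, 5, 6, 7, 8]
edges = [(1, 2), (2, 3), (2, 4), (3, 5), (3, 6),
         (5, 7), (6, 7), (4, 7), (7, 2), (7, 8)]

idx = {n: i for i, n in enumerate(nodes)}
matrix = [[0] * len(nodes) for _ in nodes]   # row = from-node, column = to-node
for src, dst in edges:
    matrix[idx[src]][idx[dst]] = 1           # a link weight could replace the 1

# A row with 2+ connections is a predicate node; V(G) = P + 1.
P = sum(1 for row in matrix if sum(row) >= 2)
assert P + 1 == 4
```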

  30. Control Structure Testing • Basis path testing is not enough • We must broaden testing coverage and improve the quality of testing • Condition Testing • Data Flow Testing • Loop Testing

  31. Condition Testing -- I • Exercise the logical conditions in a program module • A condition is a Boolean variable or a relational expression • Compound conditions combine two or more simple conditions with Boolean operators • Detects errors in conditions AND also in the rest of the program: if a test set is effective for conditions, it is likely also effective for other errors.

  32. Condition Testing -- II • Branch Testing - test each True and False branch at least once • Domain Testing - 3 or 4 tests for a relational expression: test for greater than, equal to, and less than, plus a test that makes the difference between the two values as small as possible.
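For a relational expression such as a > b, the domain-testing guideline yields four cases; a small sketch with an invented comparison function:

```python
def is_greater(a, b):
    return a > b

assert is_greater(5, 3) is True         # greater than
assert is_greater(3, 3) is False        # equal to
assert is_greater(2, 3) is False        # less than
assert is_greater(3 + 1e-9, 3) is True  # difference as small as practical
```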

  33. Data Flow Testing • Selects test paths according to the locations of definitions and uses of variables in the program • Not practical for a large system, but can be targeted at suspect areas of the software • Useful for selecting test paths containing nested if and loop statements
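A tiny illustration of definition-use pairs (the function is hypothetical): the variable rate has two definitions and one use, so a data-flow-adequate test set covers a path from each definition to the use:

```python
def discount(price, is_member):
    rate = 0.0                 # definition 1 of `rate`
    if is_member:
        rate = 0.5             # definition 2 of `rate`
    return price * (1 - rate)  # use of `rate`

# One test path per definition-use pair of `rate`:
assert discount(100, False) == 100.0  # covers definition 1 -> use
assert discount(100, True) == 50.0    # covers definition 2 -> use
```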

  34. Loop Testing • A white box technique that focuses on the validity of loop constructs • Four different types of loops: • Simple loops • Nested loops • Concatenated loops • Unstructured loops - should be redesigned to reflect structured constructs

  35. Loop Testing Simple Loops, Nested Loops, Concatenated Loops, Unstructured Loops

  36. Loop Testing: Simple Loops Minimum conditions for simple loops: 1. skip the loop entirely 2. only one pass through the loop 3. two passes through the loop 4. m passes through the loop, where m < n 5. (n-1), n, and (n+1) passes through the loop, where n is the maximum number of allowable passes
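The five conditions can be written down directly for a loop capped at n = 10 passes; the helper function is invented for illustration:

```python
def sum_first(values, n=10):
    total = 0
    for v in values[:n]:   # the loop makes at most n passes
        total += v
    return total

assert sum_first([]) == 0          # 1. skip the loop entirely
assert sum_first([1]) == 1         # 2. one pass
assert sum_first([1, 2]) == 3      # 3. two passes
assert sum_first([1] * 5) == 5     # 4. m passes, m < n
assert sum_first([1] * 9) == 9     # 5. n - 1 passes
assert sum_first([1] * 10) == 10   #    n passes
assert sum_first([1] * 11) == 10   #    n + 1 items: still capped at n passes
```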

  37. Loop Testing: Nested Loops Nested Loops: 1. Start at the innermost loop. Set all outer loops to their minimum values. 2. Test the min+1, typical, max-1, and max values for the innermost loop while holding the outer loops at their minimum values. 3. Move out one loop and set it up as in step 2, holding all other loops at typical values, until the outermost loop has been tested. Concatenated Loops: if the loops are independent of each other, treat them as simple loops; otherwise treat them as nested loops.

  38. Black-Box Testing [Diagram: input and events enter the software; output is checked against the requirements.] Also called behavioral testing.

  39. Black Box Testing • Does not replace white box testing • A complementary approach • Focuses on the functional requirements of the software • Tries to find the following types of errors: • incorrect or missing functions • interface errors • errors in data structures or database access • behavior or performance errors • initialization or termination errors

  40. Black Box Testing • Done during later stages of testing • Tests are designed to answer the following questions: • How is functional validity tested? • How are system behavior and performance tested? • What classes of input will make good test cases? • Is the system sensitive to certain input values? • How are the boundaries of a data class isolated? • What data rates and data volume can the system take? • What effect will specific data combinations have on the system?

  41. Equivalence Partitioning • A black box method that divides the input domain of a program into classes of data from which test cases can be derived • Strive to design test cases that uncover whole classes of errors (e.g. incorrect processing of all character data), reducing the total number of test cases that must be developed and run.

  42. Equivalence Partitioning [Diagram: the input domain - user queries, mouse picks, data - partitioned into classes; outputs include output formats, prompts, and errors.]

  43. Sample Equivalence Classes Valid data: • user-supplied commands • responses to system prompts • filenames • computational data • physical parameters • bounding values • initiation values • output data formatting • responses to error messages • graphical data (e.g. mouse picks). Invalid data: • data outside the bounds of the program • physically impossible data • a proper value supplied in the wrong place

  44. Equivalence Class Definition Guidelines • Input condition specifies a range - one valid and two invalid classes are defined • Input condition requires a specific value - one valid and two invalid classes are defined • Input condition specifies a member of a set - one valid and one invalid class are defined • Input condition is Boolean - one valid and one invalid class are defined • E.g. a telephone prefix - a 3-digit number not beginning with 0 or 1; input condition, range: specified value > 200; input condition, value: 4-digit length
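The prefix example can be sketched as code, with one representative test per equivalence class; the validation function itself is an assumption for illustration:

```python
def valid_prefix(prefix):
    # A 3-digit number not beginning with 0 or 1 (the slide's example).
    return len(prefix) == 3 and prefix.isdigit() and prefix[0] not in "01"

assert valid_prefix("555") is True   # valid class
assert valid_prefix("055") is False  # invalid: begins with 0
assert valid_prefix("155") is False  # invalid: begins with 1
assert valid_prefix("55") is False   # invalid: wrong length
```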

  45. Boundary Value Analysis • More errors occur at the boundaries of the input domain • BVA leads to the selection of test cases that exercise the boundaries • Guidelines: • Input in range a..b: select a, b, and values just above and just below a and b • Inputs that specify a number of values: select the min and max, and values just above and below them • Use the same guidelines for output conditions • Boundaries of data structures (e.g. an array with 100 entries): test at the boundary
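For an input in the range a..b, the guideline picks the endpoints plus values just inside and just outside them; a sketch for a hypothetical 1..100 range:

```python
def in_range(x, a=1, b=100):
    return a <= x <= b

# Boundary values for 1..100: a, b, and neighbours just above/below each.
cases = [(0, False), (1, True), (2, True), (99, True), (100, True), (101, False)]
for x, expected in cases:
    assert in_range(x) is expected
```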

  46. Software Testing Strategies Vi and Ira Glickstein

  47. Testing Goals - Review • The goal is to discover as many errors as possible with minimum effort and time • Testing is a destructive activity - the people who constructed the software are now asked to test it • They have a vested interest in showing the software is error-free, meets requirements, and will meet budget and schedule • This works against thorough testing • Therefore, should the developer do no testing? Should all testing be done independently, with testers getting involved only when developers have finished construction?

  48. Testing Strategies ... • In the past, the only defense against programming errors was careful design and the intelligence of the programmer • Now we have modern design techniques and formal technical reviews to reduce the number of initial errors in the code • In Chapter 17 we discussed how to design effective test cases; now we discuss the strategy we use to execute them • The strategy is developed by the project manager, software engineers, and testing specialists. It may also be mandated by the customer.

  49. Why is Testing Important? • Testing often accounts for more effort than any other software engineering activity • If done haphazardly, we • waste time • waste effort • let errors sneak through • Therefore we need a systematic approach for testing software • The work product is a Test Specification (Test Plan)

  50. What is a Test Plan? • a road map describing the steps to be conducted • specifies when the steps are planned and then undertaken • states how much effort, time, and resources will be required • must incorporate test planning, test case design, test execution, and data collection and evaluation. It should be flexible enough for customized testing but rigid enough for planning and management tracking.