A Framework for Combining Multiple Test Case Generation Techniques
Presentation Transcript

  1. A Framework for Combining Multiple Test Case Generation Techniques Troy Neilson MCS Defense Supervisor: Dr. Andrew McAllister 25 October 2005

  2. Outline • Introduction / Motivation / Goals • Background • Choice Relation Framework • Condition/Decision/Loop (CDL) Coverage • Improvements to Choice Relation Framework • Database Framework Structures • Prototype Tool • Conclusions & Future Work

  3. Introduction • Computer Software vs a Car

  4. Introduction • Software Testing • One of the most important parts of development • Requires ~40% of the development time • Many different testing techniques • Need: framework supporting multiple techniques

  5. Common Challenges • Informal system specifications • Reference material lacks practicality • Instinct is minimal, ad hoc • Time & effort constraints • Testing lacks transparency (plagiarism) • Chasm between research & practice • Small examples

  6. Future Environment • Full automation is not beneficial • Features: • Multi-technique support • User Interface • Wizard Tools • Intermediate Results (audit trail) • Input Value Suggestions

  7. Future Environment • Features (cont.) • Automate Repetitive Processes • Test Case Execution Support • Automatic Grading • Identify Redundant Test Cases

  8. Thesis Goals • Thesis Goals: • Select two representative techniques • Solve problems with selected techniques • Create a common framework • Generate a prototype tool

  9. Long Term Goals • Long Term (future work): • Create an environment supporting the teaching of software testing. • Technique Integration: • More complete error detection • Identify redundant test cases • Cross-technique information sharing

  10. Background - Other Frameworks • TEAM Architecture (1989) • Similarities: • Supports multiple techniques • Differences: • Focus on object code & compilers • Underlying structure is vague • Design Patterns • Too generalized • Model-based frameworks

  11. Background - Structural and Functional • Functional Testing (Black-box) • Based on system requirements • No knowledge of code • e.g. National brake safety standards • Structural Testing (White-box) • Based on system internals • e.g. brake expert • Combination is best solution

  12. Background - Two Selected Techniques • Criteria: • One structural and one functional • Based on informal specifications • Representative of other techniques • Specific application details • Hands-on

  13. Background - Testing Techniques

  14. Background - Choice Relation Framework • Category-Partition Method (CPM) • Based on informal specifications • Partitions the input domain • Comprehensive, hands-on and intuitive • Choice Relation Framework Improvements • Choice relation table • More structured test case generation method • Consistency checks • Automatic deductions of relations

  15. Background - Choice Relation Framework • An Example: Write a program that will count the number of lines in a (text) file. The filename should be passed as a command-line argument.

  16. Background - Choice Relation Framework • Breakdown of the example: • Step 1 – Functional Units • Line count program • Step 2 – Input Parameters (or Unit Inputs) • Filename & File Contents • Step 3 – Categories (major characteristics) • Filename → Filename • File Contents → Number of lines • File Contents → Embedded blank lines

  17. Background - Choice Relation Framework • Step 4 – Choices (an equivalence class) • Filename → No Filename (NFN), Valid Filename (VFN), Invalid Filename (IFN) • Number of Lines → One Line (OL), Multiple Lines (ML) • Embedded Blank Lines → One (OBL), Multiple (MBL)
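As a concrete reference for Steps 1–4, the functional unit from this example can be sketched as a small Java program; the class name and error handling are illustrative, not from the thesis:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical implementation of the line-count functional unit.
// The filename is passed as a command-line argument, matching the example.
public class LineCount {
    // Returns the number of lines in the given text file.
    static long countLines(Path file) throws IOException {
        try (var lines = Files.lines(file)) {
            return lines.count();
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length != 1) { // corresponds to the No Filename (NFN) choice
            System.err.println("usage: LineCount <filename>");
            return;
        }
        System.out.println(countLines(Path.of(args[0])));
    }
}
```

Each choice maps onto an observable behavior of this unit: NFN hits the usage message, IFN makes `Files.lines` throw, and OL/ML vary the returned count.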

  18. Background - Choice Relation Framework

  19. Background - Choice Relation Framework • Step 5 – Create a choice relation table • Choice Relations: • Fully Embedded • Partially Embedded • Not Embedded

  20. Background - Choice Relation Framework

  21. Background - Choice Relation Framework • Test Frame – Unordered set of choices • e.g. β = {valid filename, one line} • Test Case – Replace each choice in a test frame with a value • e.g. β = {“00014.txt”, “third of the way”}

  22. Background - Choice Relation Framework • Step 6 – Priority Table (optional) • Step 7 – Create Test Frames • 10 Propositions • 5 Construction Rules • Step 8 – Create Test cases
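Step 8 (turning test frames into test cases) amounts to substituting a concrete value for each choice in a frame. A minimal sketch, where the value table is illustrative rather than taken from the thesis:

```java
import java.util.List;
import java.util.Map;

// Sketch of Step 8: a test frame is an unordered set of choices;
// replacing each choice with a concrete input value yields a test case.
public class TestFrameToCase {
    static List<String> toTestCase(List<String> frame, Map<String, String> values) {
        // Look up a concrete value for every choice in the frame.
        return frame.stream().map(values::get).toList();
    }

    public static void main(String[] args) {
        List<String> frame = List.of("valid_filename", "one_line");
        Map<String, String> values = Map.of(
                "valid_filename", "00014.txt",
                "one_line", "third of the way");
        System.out.println(toTestCase(frame, values)); // prints [00014.txt, third of the way]
    }
}
```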

  23. Background - CDL Coverage • Condition/Decision/Loop Coverage • Teaching Approach for Software Testing • Straight-forward • Loop Coverage • Condition/Decision Coverage

if (count < MAX_COUNT     // condition 1
    && count > MIN_COUNT) // condition 2

  24. Background - CDL Coverage • Breakdown of an example: • Step 1 – Functional Units • Grade program • Step 2 – Conditions, Decisions and Loops

  25. Background - CDL Coverage

while ((grade >= 0.0)            // lp1.1 – loop condition #1.1
        && (count <= 3)) {       // lp1.2 – loop condition #1.2
                                 // lp1 – loop decision #1
    if ((grade >= 85)            // cd1.1 – condition #1.1
            && (grade <= 100)) { // cd1.2 – condition #1.2
                                 // cd1 – decision #1
        letterGrade = "A";
    } else if (grade >= 70) {    // cd2 – decision #2
        letterGrade = "B";
    }
    count++;
}

  26. Background - CDL Coverage • Step 3 – Conditions & Decisions Table

  27. Background - CDL Coverage • Step 4 – Loops Table

  28. Background - CDL Coverage • Step 5 – Create Test Case Table

  29. Background - CDL Coverage • Step 6 – Fill in CD Table

  30. Background - CDL Coverage • Step 7 – Fill in Loops Table • Step 8 – Execute Test Cases
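One way to see Steps 5–8 in action is to isolate the decision logic of the grade example into a testable method and pick grades that drive each condition true and false at least once. This refactoring is illustrative (the method name and the "" fall-through value are assumptions):

```java
// Illustrative refactoring of the grade example: decisions cd1 and cd2
// are isolated so each condition outcome can be hit by one test case.
public class GradeUnit {
    static String letterGrade(double grade) {
        if (grade >= 85           // cd1.1
                && grade <= 100)  // cd1.2
            return "A";           // cd1 true
        else if (grade >= 70)     // cd2
            return "B";           // cd2 true
        return "";                // both decisions false
    }

    public static void main(String[] args) {
        // One test case per condition outcome in the CD table:
        System.out.println(letterGrade(90));  // cd1.1 T, cd1.2 T -> A
        System.out.println(letterGrade(110)); // cd1.1 T, cd1.2 F, cd2 T -> B
        System.out.println(letterGrade(60));  // cd1.1 F, cd2 F -> ""
    }
}
```

The grade 110 exists only to force cd1.2 false; it exercises the branch rather than a realistic input.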

  31. Improvements to Choice Relation Framework • Sets of Input Values • Category & Choice Suggestions • Invalid Test Frames • Necessary in Design of Framework

  32. Improvements - Handling Sets of Input Values • Not handled • Prevalent in real world • Three unique features • Number of elements • Order of elements • Type of elements • Solution based on SCAT

  33. Improvements - Handling Sets of Input Values • Create a group of sets to include in the choice relation table. • Solution: • Apply the choice relation framework to: • Overall set • Elements contained by the set • Generates several test frames • Each test frame represents a unique Set

  34. Improvements - Handling Sets of Input Values • Four types of Test Elements • e.g. Range of 0 to 10 • Typical e.g. 5 • Special e.g. 0, 1, 9, 10 • Illegal e.g. -1, 11 • Sentinel

  35. Improvements - Handling Sets of Input Values • Combination Rules: • One random Typical test element • Create a single element set • Each Special test element • Create a single element set • Combine with one or more typical test elements • Random order • Random number

  36. Improvements - Handling Sets of Input Values • Combination Rules: • Each Illegal & Sentinel test element • Create a single element set • Combine with one or more typical test elements • First element in set • Middle element in set • Last element in set • A set with all special test elements • A set with all typical test elements
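A partial sketch of these combination rules; it covers only the typical and special rules, uses a fixed seed for reproducibility, and all names are illustrative (illegal and sentinel handling would follow the same pattern):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

// Partial sketch of the combination rules for sets of input values:
// single-element sets, special elements mixed with random typical
// elements in random order, and the all-special / all-typical sets.
public class SetCombiner {
    static List<List<Integer>> combine(List<Integer> typical, List<Integer> special, long seed) {
        Random rnd = new Random(seed); // fixed seed for reproducibility
        List<List<Integer>> sets = new ArrayList<>();
        // Rule: one random typical element as a single-element set.
        sets.add(List.of(typical.get(rnd.nextInt(typical.size()))));
        for (Integer s : special) {
            // Rule: each special element as a single-element set...
            sets.add(List.of(s));
            // ...and combined with a random number of typical elements.
            List<Integer> mixed = new ArrayList<>(List.of(s));
            int n = 1 + rnd.nextInt(typical.size());
            for (int i = 0; i < n; i++)
                mixed.add(typical.get(rnd.nextInt(typical.size())));
            Collections.shuffle(mixed, rnd); // random order
            sets.add(mixed);
        }
        // Rule: a set with all special elements; a set with all typical elements.
        sets.add(List.copyOf(special));
        sets.add(List.copyOf(typical));
        return sets;
    }
}
```

For the "range 0 to 10" example, `combine(List.of(3, 5, 7), List.of(0, 10), seed)` would produce single-element sets for 0 and 10, mixed sets, and the two closing sets.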

  37. Improvements - Category & Choice Suggestions • Unit Inputs broken down into Categories • e.g. An Array → Size, Sorted, etc. • Categories broken down into Choices • e.g. Size → empty, one, maximum, etc. • Similar inputs have similar categories • Similar categories have similar choices • Knowledge base of Unit Input Types & Category Types

  38. Improvements - Eliminating Invalid Test Frames • Different choice relation table orderings • Result in different numbers of test frames • Additional test frames • Non-identical, but redundant • Invalid based on propositions • Invalid Test Frame – If a choice x is fully embedded in a choice y, then a test frame β1 is invalid if it contains x but does not contain y.

  39. Improvements - Eliminating Invalid Test Frames

  40. Improvements - Eliminating Invalid Test Frames • Generates six test frames:
1 {valid_filename}
2 {valid_filename, multiple_lines}
3 {valid_filename, one_line}
4 {multiple_lines}
5 {invalid_filename}
6 {one_line}
• Multiple Lines is fully embedded in VFN • Any test frame with ML must have VFN

  41. Improvements - Eliminating Invalid Test Frames • β = Test Frame, x = current choice • Construction Rule #4 requires: • β  x and x  β. • Based on Propositions #9 and #10, its requirements are: • y  x for every choice y ∈ β (Reverse of Proposition 9a) • x  y for every y ∈ β (Proposition 10)

  42. Improvements - Eliminating Invalid Test Frames • If construction rule #4 is called • x  y for every y ∈ β • All choices in β are stored in a list FEC (Fully Embedded Choices) • After adding x to all test frames • Each test frame β containing x • Add any choices in FEC not already in the test frame • Duplicates are removed
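The completion step described on this slide can be sketched as follows; the method and variable names are illustrative, not from the thesis:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Sketch of the invalid-test-frame fix: after choice x has been added,
// every frame containing x also receives the choices stored in FEC
// (the choices x is fully embedded in); duplicate frames are removed.
public class FrameCompletion {
    static List<Set<String>> complete(List<Set<String>> frames, String x, Set<String> fec) {
        Set<Set<String>> result = new LinkedHashSet<>(); // drops duplicate frames
        for (Set<String> frame : frames) {
            Set<String> completed = new LinkedHashSet<>(frame);
            if (completed.contains(x))
                completed.addAll(fec); // add missing fully-embedded choices
            result.add(completed);
        }
        return new ArrayList<>(result);
    }
}
```

With the line-count example, completing the frame {multiple_lines} with FEC = {valid_filename} makes it identical to {valid_filename, multiple_lines}, so the redundant frame collapses away.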

  43. Improvements - Eliminating Invalid Test Frames • Implementation Results

  44. Framework Structure • First step towards an environment • Support both techniques & improvements • Incremental Testing Framework • Test Suite • Test Case • Test Result
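The three core entities of the incremental testing framework can be sketched as simple records; the field names are illustrative, since the slide only names the entities:

```java
import java.util.List;

// Illustrative data model for the incremental testing framework:
// a test suite groups test cases, and each execution yields a result.
public class FrameworkModel {
    record TestResult(boolean passed, String actualOutput) {}
    record TestCase(String name, List<String> inputs, String expectedOutput) {}
    record TestSuite(String name, List<TestCase> testCases) {}

    public static void main(String[] args) {
        TestCase tc = new TestCase("one_line", List.of("00014.txt"), "1");
        TestSuite suite = new TestSuite("line_count", List.of(tc));
        System.out.println(suite.testCases().size()); // prints 1
    }
}
```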

  45. Framework - Category-Partition Method • Requires six new tables: • Functional Unit • Unit Inputs (Input Parameter) • Category • Choice • Data Type • Test Case Choice

  46. Framework - Choice Relation Framework • Requires four new tables: • Choice Relation • Relational Operator • Parent Choice Relation • Relation Type

  47. Framework - CDL Coverage • Requires two new tables: • Conditions and Decisions • Loops • Reuse the Choice Table • Category Choice Table • The grade example required a series of inputs • Use framework for sets of input values?

  48. Framework - Choice Relation Framework • Handling sets of input values • Choice Type: Typical, Special, Illegal, Sentinel