
  1. CoWTeSt: A Cost Weighted Test Strategy

  2. Test Planning Strategy Objectives:
  • Decide which and how many test cases should be executed.
  • Give practical help to managers in supporting test planning and in evaluating the impact of the testing phase on the cost of the final product.

  3. Features
  • (Exclusively) UML based: it uses the Use Case and Interaction diagrams developed during the analysis and design phases
  • CoWTeSt strategy: development of a systematic method for UML design-based integration testing of complex systems
  • Based exclusively on the use of UML diagrams:
  • immediately usable by industries already using UML (no additional modeling or design costs);
  • analysis can be started early and carried out concurrently with project development;
  • Different strategies for test case selection

  4. Presentation Scheme [Figure: overview diagram relating CoWTeSt and UIT — test cases mapped onto objects (Obj 1, Obj 2, Obj 3) and their methods (Method1(), Method2(), Method3())]

  5. CoWTeSt: A Cost Weighted Test Strategy
  • Weighted Tree Derivation: starting from the main Use Case Diagram, the UCs and SDs are organized into a hierarchical tree
  • Deduce the Critical Profile: annotate every arc with a value representing "the importance" of the associated node (be it a UC or an SD scenario)
  • Test Case Derivation: apply UIT for test derivation
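The weighted-tree idea above can be sketched in a few lines of Python. This is a minimal illustration, not the CoWTeSt tool itself: each node carries the weight annotated on its incoming arc, and a node's final weight is the product of all arc weights on the path from the root. The tree shape below is an assumption reconstructed from the weights in the later tables.

```python
class Node:
    """A UC or SD scenario node; `weight` is the value on its incoming arc."""
    def __init__(self, name, weight=1.0, children=None):
        self.name = name
        self.weight = weight
        self.children = children or []

def final_weights(node, path_product=1.0):
    """Map each node name to the product of arc weights from the root."""
    w = path_product * node.weight
    result = {node.name: w}
    for child in node.children:
        result.update(final_weights(child, w))
    return result

# Illustrative tree using node names and arc weights from the tables:
root = Node("SK", 1.0, [
    Node("SK_SD", 0.3),
    Node("NAM", 0.05),
    Node("NRM", 0.05),
    Node("CM", 0.6, [Node("CM_SD", 0.2), Node("CM_NDchild", 0.8)]),
])

weights = final_weights(root)
# e.g. CM_SD gets 0.6 * 0.2 = 0.12, matching its entry in the selection table
```

Note that each sibling group's arc weights sum to 1, so the final weights of all leaves also sum to 1.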

  6. Integration Stage
  • The first integration stage is represented by the main UC and the SDs (if any) that are children of this node (hence at level 2 of the tree).
  • The i-th integration stage is represented by the UCs positioned at the i-th level of the tree and every SD, child of these nodes, situated at the (i+1)-th level.
  NOTE: In the i-th integration stage the SDs at level i+1 are considered because they represent the interaction between the different components that realize the functionalities described in the UCs at the i-th level of the tree.
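The stage rule above can be made concrete with a small sketch (a hypothetical helper, not part of the tool): stage i collects the UCs at tree level i together with the SDs one level below them.

```python
def integration_stage(levels, i):
    """levels: dict mapping tree level -> list of (name, kind) pairs,
    where kind is 'UC' or 'SD'. Returns the nodes of integration stage i."""
    ucs = [name for name, kind in levels.get(i, []) if kind == "UC"]
    sds = [name for name, kind in levels.get(i + 1, []) if kind == "SD"]
    return ucs + sds

# Example levels using node names from the tables in this presentation:
levels = {
    1: [("SK", "UC")],
    2: [("SK_SD", "SD"), ("NAM", "UC"), ("NRM", "UC"), ("CM", "UC")],
    3: [("CM_SD", "SD"), ("EM", "UC"), ("S", "UC"), ("C", "UC"), ("R", "UC")],
}

first_stage = integration_stage(levels, 1)   # ['SK', 'SK_SD']
```

A fuller version would also check that each SD is actually a child of a UC at level i; the flat level map keeps the sketch short.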

  7. 3rd Integration Stage

  8. Test Case Selection
  • For every node, the product of all the node weights on the complete path from the root to that node represents its final weight.

  Int-Stage   Leaf name    Critical  2nd Stage      3rd Stage      4th Stage
                           profile   wf / NTest     wf / NTest     wf / NTest
  1st Stage   SK           1
              SK_SD        0.3       0.3   / 150    0.3   / 150    0.3   / 150
              SK_NDchild   0.7       -     / -      -     / -      -     / -
  2nd Stage   NAM          0.05      0.05  / 25     0.05  / 25     0.05  / 25
              NRM          0.05      0.05  / 25     0.05  / 25     0.05  / 25
              CM           0.6       -     / -      -     / -      -     / -
              CM_SD        0.2       0.12  / 60     0.12  / 60     0.12  / 60
              CM_NDchild   0.8       0.48  / 240    -     / -      -     / -
  3rd Stage   EM           0.1                      0.06  / 30     0.06  / 30
              S            0.3                      -     / -      -     / -
              S_SD         0.2                      0.036 / 18     0.036 / 18
              S_NDchild    0.8                      0.144 / 72     -     / -
              C            0.3                      -     / -      -     / -
              C_SD         0.10                     0.018 / 9      0.018 / 9
              C_NDchild    0.90                     0.162 / 81     -     / -
              R            0.1                      0.06  / 30     0.06  / 30
  4th Stage   OcS          0.4                                     0.072 / 36
              TcS          0.4                                     0.072 / 36
              DiscofConn   0.45                                    0.081 / 41
              Conn         0.45                                    0.081 / 41
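The NTest columns follow from the fixed-budget criterion: once every leaf has its final weight, a total budget of N test cases is split proportionally. A minimal sketch (the budget of 500 is an assumption that reproduces the table's numbers, e.g. SK_SD: 0.3 × 500 = 150):

```python
def allocate_tests(leaf_weights, budget):
    """Split a total test budget across leaves in proportion to their
    final weights (rounding to the nearest integer)."""
    return {leaf: round(w * budget) for leaf, w in leaf_weights.items()}

# Final weights of the 2nd-integration-stage leaves from the table:
second_stage = {"SK_SD": 0.3, "NAM": 0.05, "NRM": 0.05,
                "CM_SD": 0.12, "CM_NDchild": 0.48}

ntest = allocate_tests(second_stage, 500)
# SK_SD -> 150, NAM -> 25, NRM -> 25, CM_SD -> 60, CM_NDchild -> 240
```

Because the leaf weights sum to 1, the allocations sum back to the budget (up to rounding).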

  9. Test Case Selection (table repeated from the previous slide)

  10. 80% Functional Coverage

  11. Selection of leaves for different coverage levels

  Leaf name    4th Stage   70% coverage    80% coverage    90% coverage    100% coverage
               weight      nwf / MinNtest  nwf / MinNtest  nwf / MinNtest  nwf / MinNtest
  SK_SD        0.3         0.413 / 5       0.354 / 5       0.137 / 6       0.3   / 17
  CM_SD        0.12        0.165 / 2       0.141 / 2       0.126 / 3       0.12  / 7
  DiscofConn   0.081       0.111 / 2       0.095 / 2       0.085 / 2       0.081 / 5
  Conn         0.081       0.111 / 2       0.095 / 2       0.085 / 2       0.081 / 5
  OcS          0.072       0.1   / 1       0.085 / 2       0.076 / 2       0.072 / 4
  TcS          0.072       0.1   / 1       0.085 / 2       0.076 / 2       0.072 / 4
  EM           0.06                        0.071 / 1       0.063 / 2       0.06  / 4
  R            0.06                        0.071 / 1       0.063 / 2       0.06  / 4
  NAM          0.05                                        0.052 / 1       0.05  / 3
  NRM          0.05                                        0.052 / 1       0.05  / 3
  S_SD         0.036                                                       0.036 / 2
  C_SD         0.018                                                       0.018 / 1
  Total                    72.3% / 13      84.6% / 17      94.6% / 23      100% / 59
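The coverage criterion can be sketched as follows (an illustration under assumptions, not the tool's exact algorithm): leaves are taken in decreasing weight order until the summed weight reaches the target coverage, and the selected weights are then renormalized, giving the nwf column. With the 4th-stage weights this yields six leaves and roughly 72% actual coverage for a 70% target, matching the first column of the table to within rounding.

```python
def select_for_coverage(leaf_weights, target):
    """Greedily pick the heaviest leaves until total weight >= target.
    Returns (renormalized weights of chosen leaves, achieved coverage)."""
    chosen, total = [], 0.0
    for leaf, w in sorted(leaf_weights.items(), key=lambda kv: -kv[1]):
        if total >= target:
            break
        chosen.append((leaf, w))
        total += w
    return {leaf: w / total for leaf, w in chosen}, total

leaves = {"SK_SD": 0.3, "CM_SD": 0.12, "DiscofConn": 0.081, "Conn": 0.081,
          "OcS": 0.072, "TcS": 0.072, "EM": 0.06, "R": 0.06,
          "NAM": 0.05, "NRM": 0.05, "S_SD": 0.036, "C_SD": 0.018}

nwf, achieved = select_for_coverage(leaves, 0.70)
# 6 leaves chosen; e.g. SK_SD's renormalized weight 0.3 / 0.726 = 0.413
```

The MinNtest column then follows by distributing the test budget according to the renormalized weights, as in the fixed-budget criterion.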

  12. Selection of leaves for different coverage levels (table repeated from the previous slide)

  13. Presentation Scheme (figure repeated from slide 4)

  14. Use Interaction Test methodology
  • A method to systematically derive Integration Test suites from UML diagrams:
  • Use case diagrams
  • Sequence (Interaction) diagrams
  • Class diagrams
  • Incremental test strategy (over the Use Case organizational structure)
  • Seven steps involved

  15. Step 1: UML design analysis and search for relevant Use Cases (Use Case diagram of uci.argo.KERNEL)

  16. Step 2/1: Analysis of the Sequence and Class diagrams involved in the selected Use Case (Sequence diagram of uci.argo.Kernel.USERMODEL)

  17. Step 2/2: Class diagram analysis (Class diagram of uci.argo.Kernel.USERMODEL) [Figure: the Designer uses DecisionModel (_decisions; DecisionModel(), GetDecisions()) and GoalModel (_goals; GoalModel(), GetGoals()); DecisionModel contains Decision objects (Decision(name:String, priority:int), GetName(), GetPriority()) and GoalModel contains Goal objects (Goal(name:String, priority:int), GetName(), GetPriority())]

  18. Step 3: Test Units definition
  • Test Units: those system units separately testable for exercising a possible use of the system.
  • Identified Test Units: DecisionModel, Decision, GoalModel, Goal.

  19. Step 4: Search for Settings and Interactions Categories
  • Settings Categories: parameters and inputs relevant in the Test Unit.
  • Interactions Categories: object interactions, exchanged messages.
  Categories:
  DecisionModel — Settings: _decisions; Interactions: DecisionModel(), getDecisions()
  Decision — Interactions: Decision(name:String, priority:int), GetName(), GetPriority()
  ............

  20. Step 5: Test Specification construction
  • For each identified category, we consider all its possible values and constraints.
  Test Specification for DecisionModel:
  Settings — _decisions: Naming, Storage, Modularity, Inheritance, Stereotypes, Relationship..
  Interactions — getDecisions(): Opening a new file, Opening a saved file, After a modification and before saving, After a modification and after saving; DecisionModel(): Class constructor

  21. Step 6: Search for Message Sequences and Test Case definition
  • Message Sequence: a set of Messages (in temporal order), involved in an SD, used by objects to define and elaborate specific functionalities.

  22. Step 6: Search for Message Sequences and Test Case definition
  • Test Case: constructed, from a Test Specification, for each possible combination of choices of every category involved in a Message Sequence.
  Example Test Case: getDecisions() with getName()/getPriority(), interaction value "Opening a saved file", setting _decisions = Naming.
  Note: such a Test Case must be repeated for all possible values of _decisions.
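The "each possible combination of choices" rule is a Cartesian product over category values, which `itertools.product` expresses directly. A minimal sketch (the category names and values follow the DecisionModel example above and are illustrative, not generated by the tool):

```python
import itertools

# One category per row of the Test Specification; values are the choices.
categories = {
    "_decisions": ["Naming", "Storage", "Modularity"],
    "getDecisions()": ["Opening a new file", "Opening a saved file"],
}

# One test case per combination of one choice from every category:
test_cases = [dict(zip(categories, combo))
              for combo in itertools.product(*categories.values())]
# 3 settings x 2 interactions -> 6 test cases
```

This also makes the slide's note concrete: repeating the test case "for all possible values of _decisions" is exactly the first factor of the product.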

  23. Step 7: Definition of UseCase Test Suites and Incremental Construction of the TestFrame
  • Test Frame uci.argo.Kernel: UseCase Test Suite UserModel, UseCase Test Suite Design Critics, UseCase Test Suite Kernel, UseCase Test Suite ToDoList

  24. Cow_Suite: CoW pluS UIT Environment
  • Tool for automating:
  • Test case derivation, from UIT
  • Test management strategy, from CoWTeSt
  • Uses the UML diagrams already developed for the analysis and design phases → no additional effort or information required.
  • Works within the Rational Rose environment

  25. Cow_Suite Architecture
  • Uses REI, the Rational Rose Extensibility Interface.
  • Composed of six different components:
  • MDL Analyser
  • Test Tree Generator
  • Weights Manager
  • Test Case Generator
  • Test Case Builder
  • Test Case Checker
  • Graphical interface organized in three tabs:
  • CowTree
  • UIT Structure
  • Test Specification

  26. 1. MDL Analyser
  • Analyses the .mdl file of Rose
  • Searches for the information necessary to apply the test selection strategy:
  • Actors and Use Cases;
  • Sequence diagrams and their messages;
  • Classes with attributes and methods;
  • Activity diagrams.
  • The information is passed as input to the next component.

  27. 2. Test Tree Generator
  • All the UCDs and SDs from the .mdl file are organized into a hierarchical tree using the explicit diagram links or using the UC associations and class relations
  • To each node we associate:
  • a level identification number;
  • a default weight such that this weight plus the weights associated to all its siblings sum to 1.
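The default-weight rule above amounts to splitting the unit weight equally among siblings. A minimal sketch (a hypothetical helper, not the Cow_Suite code):

```python
def default_weights(sibling_names):
    """Assign each sibling an equal default weight so the group sums to 1."""
    n = len(sibling_names)
    return {name: 1.0 / n for name in sibling_names}

# Four children of the root get 0.25 each by default; the Weights Manager
# later lets the user replace these with the real critical profile.
w = default_weights(["SK_SD", "NAM", "NRM", "CM"])
assert abs(sum(w.values()) - 1.0) < 1e-9
```

The sum-to-1 invariant is what makes the final leaf weights (products along root-to-leaf paths) behave as a probability-like profile.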

  28. 3. Weights Manager
  • Interacts with the user for:
  • Assigning the real weight to each node, by selecting a node directly or by selecting a level number
  • Checking that the sum of the user-assigned weights of the nodes at each level equals 1
  • Choosing the criterion for test case selection:
  • fixing the maximum number of test cases to be executed, or
  • fixing the percentage of functionalities to be covered.

  29. 4. Test Case Generator
  • Queries the user for the deepest integration level he/she is interested in and calculates the final weight of every leaf.
  • Implements the UIT method for test generation.
  • Associates to each SD its test cases, organized hierarchically.
  • Calculates and visualizes the number of test cases for each SD depending on the chosen selection criterion.
  • Associates to each SD the frames of the test cases that should be instantiated.

  30. Test Case Frame
  TEST CASE 8.2
  Description:
  PreCondition: Test Case 7
  Flow of events: getEnterpriseForSourceAddress(caller) [else] routingResult = doLRQ(caller, callee, Enterprise, callcase, AccessAgent, BGAResult)
  Categories:
  Settings Categories: PEnterprise = ; Callee = ; BGAResult ……
  Interactions Categories: getEnterpriseForSourceAddress = ; doLRQ =
  PostCondition:
  Comment:

  31. 5. Test Case Builder
  • Interacts with the user for test case implementation:
  • supplying the necessary parameters;
  • adding and removing categories;
  • changing operation values or even the test case structure.
  • The changes involving the UML design are finally saved in a new file.

  32. 6. Test Case Checker
  • The tool maintains information about the test case generation.
  • This module compares different versions of the same .mdl file. The discovered differences, such as new test cases or changes to those already generated, are saved into a separate file.
  • The cost and impact of the updates to the test plan with respect to the "official version" are evaluated by analysing this file.
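The comparison the Test Case Checker performs can be sketched as a simple three-way diff over the generated test cases (a hypothetical illustration, not the Cow_Suite module; test-case identifiers below are made up):

```python
def diff_test_cases(old, new):
    """old/new: dict mapping test-case id -> frame content.
    Returns ids that are new, changed, or removed in the new version."""
    added = [tc for tc in new if tc not in old]
    changed = [tc for tc in new if tc in old and new[tc] != old[tc]]
    removed = [tc for tc in old if tc not in new]
    return {"added": added, "changed": changed, "removed": removed}

# Two versions of the generated suite (illustrative ids and contents):
old = {"TC7": "frame", "TC8.2": "frame-v1"}
new = {"TC7": "frame", "TC8.2": "frame-v2", "TC9": "frame"}

report = diff_test_cases(old, new)
# {'added': ['TC9'], 'changed': ['TC8.2'], 'removed': []}
```

Saving such a report to a separate file is what lets the tool estimate the cost of updating the test plan against the "official version".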

  33. References:
  • F. Basanieri, A. Bertolino, “A Practical Approach to UML-based Derivation of Integration Tests”, QWE2000, Brussels, 20-24 November 2000, 3T.
  • F. Basanieri, A. Bertolino, E. Marchetti, “CoWTeSt: A Cost Weighed Test Strategy”, accepted for ESCOM-SCOPE 2001, London, England, 2-4 April 2001.
  • F. Basanieri, A. Bertolino, E. Marchetti, A. Ribolini, “An Automated Test Strategy Based on UML Diagrams”, submitted to the ICSE 2001 WAPATV workshop.