Presentation Transcript


  1. Automatic test generation from system requirements and application to software product lines (Génération automatique de tests à partir des exigences et application aux lignes de produits logicielles) Yves Le Traon - Clémentine NEBUT

  2. Outline • Introduction • Requirement models • Test generation • Conclusions and future work

  3. Modern software issues • Software size keeps growing: testing methods have to adapt to this change of scale • System requirements evolve very often (Nokia: 69% of the requirements modified, 22% modified twice): new tests must be built quickly from the new requirements • System requirements are written in natural language: a formalization is needed to apply automatic test generation techniques • Product line concept (common requirements + variations): new testing methods are needed

  4. Testing product lines. Classical approach: for n products P1..Pn, one test suite Test(P1)..Test(Pn) is built; the cost of test generation is linear in n, and the cost of testing a new product is heavy. Ideal approach: make the cost of test generation decrease, factorize the testing tasks, and allow rapid validation of a new product.

  5. Our objectives. Make the cost of system test generation decrease: automatic test generation from the requirements. Constraints: 1. complexity (size) of real software; 2. compatibility with usual industrial practice; 3. adaptability to the product lines context.

  6. Automatic test generation: classical functional approaches • Using behavioral models: requirements (Z, B, SDL, ...) are turned into a behavioral model (LTS, IOLTS, FSM, EFSM, ...); paths are extracted under coverage criteria (all transitions, all nodes, ...) to obtain test cases; oracle and feasibility issues remain • Category-partition approaches: categories and partitions are identified, then choices are combined during test selection to produce test cases • Open questions: 1. complexity? 2. industrial practice? 3. product lines context?

  7. Test generation from UML use cases. « Use cases are a means for specifying required usages of a system » (OMG, UML 2.0). • Fröhlich et al. (2000): from Cockburn-formatted use cases to state machines • Ryser et al. (2000): from use cases to statecharts and dependency charts • Riebish et al. (2002): statistical testing • Basanieri et al. (2002): cow_suite approach • Briand et al. (2002): TOTEM approach

  8. Existing approaches to test product lines • Scarce! Few automatic approaches • From use cases: a systematic approach (Kamsties et al.); PLUTO, an approach based on category/partition (Bertolino et al.)

  9. Overview of the proposed approach. Requirements written in controlled natural language (e.g. requirement 1.1 "Register a book": the "book" becomes "registered" after the "librarian" did "register" the "book"; the "book" is "available" after the "librarian" did "register" the "book") are turned into a requirement model. Simulation of this model yields test objectives (e.g. [connect(p1), plan(p1,m1)] and [connect(p1), plan(p1,m1), open(p1,m1), close(p1,m1)]), which are refined into test scenarios and finally into test cases.

  10. Outline • Introduction • Requirement models • Test generation • Conclusions and future work [Process recap: Requirements -> Requirement model -> Test objectives -> Test scenarios -> Test cases]

  11. Enhanced use case model. [Use case diagram of the VirtualMtg example: use cases connect, plan, enter, leave, consult, open, close, speak, hand over; actors manager, moderator, user; the Plan use case (UC1) carries parameters, a precondition ("the meeting does not already exist") and a postcondition ("the meeting exists and is organized by the manager").] • Enhancement of the UML use case model: • Parameters (actors or business concepts) represent the involved entities • Contracts (precondition and postcondition) handle the parameters

  12. A use case contract language. [Example: Plan(p:participant, m:meeting) with precondition not planned(m) and postcondition planned(m) and manager(p,m).] • First-order logic expressions • Boolean properties (predicates) = name + typed parameters, e.g. planned(m:meeting), manager(u:participant, m:meeting) • Enumerated properties • Classical Boolean operators (and, or, implies, not) • Quantifiers (forall, exists) • Benefits: formalization of the use cases; dependencies between the use cases can be deduced

  13. Deducing dependencies: example. #use case OPEN: UC open(u:participant; m:mtg) pre created(m) and moderator(u,m) and not closed(m) and not opened(m) and connected(u) post opened(m). #use case CLOSE: UC close(u:participant; m:mtg) pre opened(m) and moderator(u,m) post ... From these contracts, OPEN(u1,m1); CLOSE(u1,m1) is deduced to be a correct sequence.
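A minimal Python sketch of how such a dependency can be checked automatically: the state is a set of ground facts and each use case exposes its pre- and postcondition as functions over that state. The encoding is an illustrative assumption (the postcondition of CLOSE, elided on the slide, is guessed), not the authors' tool.

```python
# State = set of ground facts (tuples); a sequence of instantiated use cases is
# correct if every precondition holds when the use case is applied.

def open_uc(u, m):
    pre = lambda s: (("created", m) in s and ("moderator", u, m) in s
                     and ("closed", m) not in s and ("opened", m) not in s
                     and ("connected", u) in s)
    post = lambda s: s | {("opened", m)}
    return pre, post

def close_uc(u, m):
    pre = lambda s: ("opened", m) in s and ("moderator", u, m) in s
    post = lambda s: (s - {("opened", m)}) | {("closed", m)}   # guessed postcondition
    return pre, post

def correct_sequence(state, sequence):
    """Apply each instantiated use case in turn; fail on a violated precondition."""
    for pre, post in sequence:
        if not pre(state):
            return False
        state = post(state)
    return True

init = {("created", "m1"), ("moderator", "u1", "m1"), ("connected", "u1")}
print(correct_sequence(init, [open_uc("u1", "m1"), close_uc("u1", "m1")]))  # True
```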

  14. Dealing with requirements in controlled natural language. [Example: requirement 1.1 "Register a book": the "book" becomes "registered" after the "librarian" did "register" the "book".] • A requirement description language (RDL) • Semantics given in terms of enhanced use cases • Benefits: compatible with the industrial practice; formalization of the natural language requirements; better traceability between requirements, analysis models and tests

  15. Test generation [Outline recap: Introduction, Requirement models, Test generation, Conclusions and future work; process recap: Requirements -> Requirement model -> Test objectives (by simulation) -> Test scenarios -> Test cases]

  16. From requirements to test objectives [Outline recap as above; the focus is now on the step from the requirement model to the test objectives, obtained by simulation]

  17. Simulation of a use case model: definition of the simulation state. Needed: an initial state and the entities involved in the simulation (e.g. p1, p2: participant; m1, m2: meeting). Applying an instantiated use case means checking its precondition and then modifying the current state. [Example: Plan(p:participant, m:meeting) with pre not planned(m) and connected(p), post planned(m) and manager(p,m); applying Plan(p1,m1) to the state {connected(p1), connected(p2)} yields {connected(p1), connected(p2), planned(m1), manager(p1,m1)}.]

  18. The Use Case Transition System (UCTS). [Excerpt for the virtual meeting example: from the state {connected(p1), created(m1), manager(p1,m1), moderator(p1,m1)}, the transition open(p1,m1) leads to the same state plus opened(m1); from there, enter(p1,m1) adds entered(p1,m1), and close(p1,m1) replaces opened(m1) by closed(m1).]
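A minimal sketch of how a UCTS can be built by exhaustive simulation: nodes are reachable states, and an edge labelled with an instantiated use case is added whenever its precondition holds. The contract encoding and the toy open/close instances are illustrative assumptions, not the authors' metamodel.

```python
# Breadth-first construction of a UCTS from an initial state and a set of
# instantiated use cases (label, precondition, postcondition).
from collections import deque

def uc(label, pre, post):
    return (label, pre, post)

def build_ucts(initial_state, instantiated_ucs):
    initial = frozenset(initial_state)
    nodes, edges = {initial}, set()
    todo = deque([initial])
    while todo:
        state = todo.popleft()
        for label, pre, post in instantiated_ucs:
            if pre(state):
                target = frozenset(post(set(state)))
                edges.add((state, label, target))
                if target not in nodes:
                    nodes.add(target)
                    todo.append(target)
    return nodes, edges

# Toy instances for participant p1 and meeting m1
ucs = [
    uc("open(p1,m1)",
       lambda s: ("created", "m1") in s and ("opened", "m1") not in s and ("closed", "m1") not in s,
       lambda s: s | {("opened", "m1")}),
    uc("close(p1,m1)",
       lambda s: ("opened", "m1") in s,
       lambda s: (s - {("opened", "m1")}) | {("closed", "m1")}),
]
nodes, edges = build_ucts({("created", "m1"), ("connected", "p1")}, ucs)
print(len(nodes), len(edges))  # 3 states, 2 transitions
```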

  19. Principles of the test objective generation • Test objective = path of the UCTS = sequence of instantiated use cases, e.g. {UC1(p1,p2), UC3(p2), UC4(p1)} or {UC3(p1), UC1(p2,p2)} • Generating test objectives: extracting short paths in the UCTS; extracting a « reasonable » number of paths • Test criteria applied to the UCTS: 4 structural criteria, 1 semantic criterion, 1 robustness criterion

  20. 4 structural test criteria • All transitions • All nodes • All instantiated use cases • All instantiated use cases and all nodes [Illustrated on a small UCTS with nodes N1..N5, use cases A(x:X), B(x:X, y:Y), C(), and instances x1, x2: X and y1: Y]
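As a sketch of the first structural criterion ("all transitions"), one simple strategy is to emit, for each transition of the UCTS, a short path from the initial node ending with that transition; each path is a test objective. This greedy strategy is an illustrative assumption, not necessarily the algorithm used in the approach.

```python
# Greedy "all transitions" coverage: one short path per transition of the UCTS.
from collections import deque

def shortest_path(edges, source, target):
    """BFS over the UCTS; returns the list of labels from source to target (or None)."""
    todo, seen = deque([(source, [])]), {source}
    while todo:
        node, labels = todo.popleft()
        if node == target:
            return labels
        for (src, label, dst) in edges:
            if src == node and dst not in seen:
                seen.add(dst)
                todo.append((dst, labels + [label]))
    return None

def all_transitions_objectives(edges, initial):
    objectives = []
    for (src, label, dst) in edges:
        prefix = shortest_path(edges, initial, src)
        if prefix is not None:
            objectives.append(prefix + [label])
    return objectives

edges = {("n0", "A(x1)", "n1"), ("n0", "A(x2)", "n2"), ("n1", "C()", "n0")}
for obj in all_transitions_objectives(edges, "n0"):
    print(obj)   # e.g. ['A(x1)'], ['A(x2)'], ['A(x1)', 'C()']
```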

  21. A semantic test criterion: all precondition terms, i.e. applying a use case in several different configurations making its precondition true. [Example: UC A(x:X) with pre P(x) and post not P(x) and Q(x); UC B(x:X) with pre P(x) or Q(x) and post not Q(x). For B, 3 configurations make the precondition true: not P(x) and Q(x); P(x) and not Q(x); P(x) and Q(x).]
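A minimal sketch of the "all precondition terms" idea: enumerate the truth assignments of the precondition's predicates that satisfy it, so that the use case is exercised in each such configuration. The encoding of predicates as named Booleans is an illustrative assumption.

```python
# Enumerate every predicate assignment under which a precondition evaluates to True.
from itertools import product

def satisfying_configurations(predicates, precondition):
    configs = []
    for values in product([True, False], repeat=len(predicates)):
        assignment = dict(zip(predicates, values))
        if precondition(assignment):
            configs.append(assignment)
    return configs

# UC B(x:X) with precondition P(x) or Q(x)
for cfg in satisfying_configurations(["P(x)", "Q(x)"], lambda a: a["P(x)"] or a["Q(x)"]):
    print(cfg)
# {'P(x)': True, 'Q(x)': True}
# {'P(x)': True, 'Q(x)': False}
# {'P(x)': False, 'Q(x)': True}
```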

  22. A robustness criterion • Generate paths that extend a correct path with a non-specified action, i.e. an invalid application of a use case • The criterion is similar to all precondition terms: cover all the configurations that make the precondition fail
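Under the same illustrative encoding as the previous sketch, the robustness criterion keeps the assignments that violate the precondition instead:

```python
# Enumerate the predicate assignments under which a precondition evaluates to False.
from itertools import product

def violating_configurations(predicates, precondition):
    return [dict(zip(predicates, values))
            for values in product([True, False], repeat=len(predicates))
            if not precondition(dict(zip(predicates, values)))]

# UC B(x:X), precondition P(x) or Q(x): the only violating configuration
print(violating_configurations(["P(x)", "Q(x)"], lambda a: a["P(x)"] or a["Q(x)"]))
# [{'P(x)': False, 'Q(x)': False}]
```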

  23. From test objectives to test scenarios [Process recap: Requirements -> Requirement model -> Test objectives (by simulation) -> Test scenarios -> Test cases; the focus is now on the step from test objectives to test scenarios]

  24. A gap to bridge: how to go from a test objective such as [connect(p1), plan(p1,m1)] to concrete test scenarios?

  25. Assumptions on the scenarios: each use case is documented with scenarios tagged {Nominal} or {Exceptional}; the scenarios carry guards, the use case parameters, the scenario parameters, and their effect on the system.

  26. Principles of the test scenario generation • Substitution of instantiated use cases by instantiated scenarios • Strong sequential composition • Cartesian product of the sets of scenarios [Example: a test objective IUCA; IUCB, where UCA has scenarios SCA1, SCA2 (instantiated as ISCA1, ISCA2) and UCB has scenarios SCB1, SCB2 (instantiated as ISCB1, ISCB2), yields the four test scenarios ISCA1;ISCB1, ISCA1;ISCB2, ISCA2;ISCB1, ISCA2;ISCB2]
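A minimal sketch of the composition step: substitute each instantiated use case of the objective by its set of instantiated scenarios and take the Cartesian product; the strong sequential composition is abstracted here as plain concatenation (an assumption).

```python
# Turn a test objective (a sequence of instantiated use cases) into test scenarios.
from itertools import product

def test_scenarios(objective, scenarios_of):
    """objective: list of instantiated use cases; scenarios_of: IUC -> list of scenarios."""
    result = []
    for combination in product(*(scenarios_of[iuc] for iuc in objective)):
        result.append(list(combination))     # one test scenario per combination
    return result

scenarios_of = {
    "UCA": ["ISCA1", "ISCA2"],   # instantiated scenarios of use case A
    "UCB": ["ISCB1", "ISCB2"],   # instantiated scenarios of use case B
}
for ts in test_scenarios(["UCA", "UCB"], scenarios_of):
    print(ts)   # ['ISCA1', 'ISCB1'], ['ISCA1', 'ISCB2'], ['ISCA2', 'ISCB1'], ['ISCA2', 'ISCB2']
```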

  27. Analysis of the verdicts. [Process: use cases and their scenarios -> generation -> test scenarios / test cases -> initialization (test data) -> execution -> Pass, Fail or Inconclusive verdict. A Fail verdict means an error is detected.]

  28. Verdict Fail: example. [Sequence diagrams between p1:user and :ATM: Deposit(account1, amount1=40) under the constraint account.amount > amount, then Withdraw(account1, amount2=30) with the assertion account.amount > 0. If the assertion is violated, an error is detected: verdict Fail.]

  29. Analysis of the verdicts (continued). [Same process; an Inconclusive verdict means a non-executed test scenario is detected, while a Fail verdict means an error is detected.]

  30. Verdict Inconclusive: example. [Deposit(account1, amount3=20), then Withdraw(account1, amount2=30): the constraint account.amount > amount is violated, so the scenario cannot be executed and the verdict is Inconclusive.]

  31. Analysis of the verdicts (continued). [Besides errors detected by Fail verdicts and non-executed test scenarios detected by Inconclusive verdicts, the analysis also reports uncovered use case scenarios.]
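A minimal sketch of the verdict logic described on the last three slides, assuming each executed step reports whether its constraint (guard) and its assertion held; the step encoding is illustrative.

```python
# Verdict analysis: a violated constraint makes the test inconclusive (the scenario
# could not be driven to its end); a violated assertion reveals an error (fail).

def verdict(steps):
    """steps: list of (constraint_holds, assertion_holds) observed during execution."""
    for constraint_holds, assertion_holds in steps:
        if not constraint_holds:
            return "inconclusive"   # non-executed test scenario detected
        if not assertion_holds:
            return "fail"           # error detected
    return "pass"

print(verdict([(True, True)]))    # pass: all constraints and assertions hold
print(verdict([(False, True)]))   # inconclusive: a constraint such as account.amount > amount is violated
print(verdict([(True, False)]))   # fail: an assertion such as account.amount > 0 is violated
```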

  32. From test scenarios to test cases. [Recap of the chain: Requirements, written in a Requirement Description Language -> Requirement model, an enhanced use case model -> Test objectives, based on simulation and using test criteria -> Test scenarios -> Test cases, with a possible use of test synthesis tools.]

  33. Test scenarios and test cases. General case: the test scenario must be turned into a test case; condition: incomplete use case scenarios (parameter wildcards, lack of certain messages); holds for realistic systems and product lines. Particular case: test scenario = test case; condition: complete use case scenarios; holds for simple systems. In the general case, the test scenarios need to be processed by test synthesis tools to obtain test cases.

  34. Test synthesis: using the UMLAUT/TGV toolkit • UMLAUT / TGV • On-the-fly test synthesis • Generation of behavioral test patterns: a positive scenario, negative scenarios, a prefix scenario [From the UML specification and a sequence diagram, LTSs are derived; test synthesis produces a test case as an LTS, then as a UML test case]

  35. Test synthesis using behavioral test patterns. [From the use cases (UC1, UC2, each with nominal and exceptional scenarios), a selection step builds behavioral test patterns (behTP1, behTP2): a positive scenario, negative scenarios (optional), prefix scenarios (optional). Test cases are synthesized from these patterns and the specification: the general design (main classes, interfaces, ...) and a UML model (class diagram, state machines, object diagram). The detailed design can then evolve.]

  36. Experiments and results. [Along the chain from requirements to test cases: academic experiments, experiments with TAS, application to the product lines, and an ongoing integration with the AGATHA tool.]

  37. Academic experiments: 3 case studies (FTP client, ATM, virtual meeting), studying code repartition, code coverage and criteria comparison. [Code repartition for the virtual meeting example: nominal code 65%, robustness code w.r.t. the environment 18%, robustness code w.r.t. the specification 8%, dead code 9%. Code coverage is plotted as covered statements against the number of test cases, for the code covered with the APT + robustness criterion.]

  38. Experiments with TAS • Two components of a weapon navigation system (Mirage 2000-9 and Rafale) • Translation of the requirements from English to RDL: 70% translated, 10% could be translated (limit of the prototype tool), 20% cannot be translated (arithmetic, real-time) • Use of the simulator: completion of implicit parts of the requirements

  39. Application to the product lines context • 3 types of variation points: 0 or 1 variant = optional; 1 out of n variants = choice (xor); m out of n variants = multiple choice (or) • What can vary in the use case model? Use cases, parameters, contracts, scenarios

  40. Variation in use case models • How to represent variation? By tagging the variant model elements: VP_name{variant_list} • Examples: UC Record(p:participant, m:meeting) {VP_Recording{true}}; UC enter(u:participant; m:meeting) pre connected(u) and opened(m), pre private(m) implies authorized(u,m) {VPMeetingType(private)}, post entered(u,m) [the examples illustrate an alternative variation point and a multiple variation point]
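A minimal sketch of how such tags could be resolved to derive the use case model of one product of the line: an element is kept when the product configuration selects one of its tagged variants. The dictionary encoding and the example configuration are illustrative assumptions, not the authors' metamodel.

```python
# Resolve variation-point tags: keep a model element iff, for every variation point
# it is tagged with, the product configuration selects one of the tagged variants.

def keep(element_tags, configuration):
    """element_tags: dict VP_name -> set of variants the element requires;
       configuration: dict VP_name -> set of variants selected for the product."""
    return all(configuration.get(vp, set()) & variants
               for vp, variants in element_tags.items())   # untagged elements are always kept

use_case_model = {
    "UC Record(p,m)":              {"VP_Recording": {"true"}},
    "UC enter(u,m) [private pre]": {"VPMeetingType": {"private"}},
    "UC enter(u,m) [public pre]":  {"VPMeetingType": {"public"}},
}
product_cfg = {"VP_Recording": {"true"}, "VPMeetingType": {"public"}}   # one product of the line
print([e for e, tags in use_case_model.items() if keep(tags, product_cfg)])
# ['UC Record(p,m)', 'UC enter(u,m) [public pre]']
```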

  41. Scenarios in the product lines context • Scenarios for product lines must be generic: use of parameters, use of wildcards, omission of certain details • Hence the necessity of the test synthesis step [test cases are synthesized from the behavioral test patterns behTP1, behTP2 for each product P1, P2, P3]

  42. Outline • Introduction • Requirement models • Test generation • Conclusions and future work

  43. Conclusions and benefits • Cutting down the test generation cost • Adapted to the product lines context • Compatible with usual industrial practice • Facing the complexity (size) of real software [Along the chain: the requirements themselves are improved; test objectives are obtained using only early requirements; test scenarios use the detailed requirements (scenarios); test cases are obtained as soon as a UML model is available]

  44. Conclusion • Testing tools: simulator, test generator • Industrial interactions: collaboration between THALES, the CEA and INRIA; European projects • Dissemination: ASE, ISSRE, ...

  45. Future work • Improving the efficiency of the generated tests: new criteria taking into account the constraints of the scenarios; test data • Other ways to model the variability • Extra-functional requirements: models? test generation? • Traceability issues in model-driven approaches: traceability models; traceability from requirements to tests

  46. Model transformations and traceability. All the metamodels are based on the MOF. Objectives: models built by model transformations; homogeneous management of traceability. [Metamodels involved: requirements, enhanced use case, UCTS, system static, configuration, test objectives, test scenarios, test cases; illustrated with the "Register a book" requirement and the test objectives [connect(p1), plan(p1,m1)] and [connect(p1), plan(p1,m1), open(p1,m1), close(p1,m1)].]

  47. Questions

  48. ...
