
MOGENTES 3rd and Final Review Reporting Period January 2010 – March 2011 Cologne, 26 May 2011






Presentation Transcript


  1. MOGENTES 3rd and Final Review Reporting Period January 2010 – March 2011 Cologne, 26 May 2011 Prolan Elpult Demonstrator (WP5/WP6) Peter Lantos, PROL

  2. Content • Evaluation based on the Elpult demonstrator • Aspects of the evaluation • Overview of the demonstrator • Layout of the demonstrator • List of the models • Test statistics • Joint live demo with BME: TCG, test run, traceability • Summary of the evaluation • Discussion

  3. Aspects of the evaluation • Content and complexity of the test cases • Complexity of the test input (complex sequences to activate the internal mechanisms) • Complexity of the test oracle (assessing the output of the tested system) • Quality of the tests • Coverage of code, model elements, and requirements • Trustworthiness of the test cases • Testing effort • Modeling and test case generation (vs. manual test case specification and implementation) • Test execution and assessment of test results • Maintenance of test cases considering corrections, modifications, rerun, and reassessment
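The "complexity of the test oracle" criterion above (assessing the output of the tested system) can be illustrated with a minimal sketch. This is only an illustration with hypothetical names; in the demonstrator, the oracle is derived from the behavior model rather than written by hand:

```python
# Minimal sketch of an automated test oracle: compare the output sequence
# observed from the system under test against the expected sequence.
# Function and variable names are illustrative, not from the demonstrator.
def oracle(expected_outputs, observed_outputs):
    """Return (verdict, first_mismatch_index) for one test case."""
    for i, (exp, obs) in enumerate(zip(expected_outputs, observed_outputs)):
        if exp != obs:
            return ("FAIL", i)
    if len(expected_outputs) != len(observed_outputs):
        return ("FAIL", min(len(expected_outputs), len(observed_outputs)))
    return ("PASS", None)

print(oracle(["red", "green"], ["red", "green"]))  # ("PASS", None)
print(oracle(["red", "green"], ["red", "red"]))    # ("FAIL", 1)
```

The point of the criterion is that such a comparison is only as trustworthy as the expected sequence it checks against, which is why generating the oracle from the model matters.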

  4. Demonstrator layout Goal: test case generation (TCG) for module testing of Objs. Is it in accordance with the specification? What is the code coverage of model-based testing (MBT)?

  5. Overview of demonstration [Diagram: the Behavior Model of Railway Objects (UML statechart) is extended into a simulator application model and an Objs application model; code generation and interface implementation produce the generated simulator code, while test case generation targets the existing Objs code.]

  9. Overview of demonstration • Modeling of the Objs module • 6 models / more than 20 versions: • Complex model for the decoding functionality • Sliced/simplified models representing independent decoding of railway object types • Separate models for the safety mechanism and for the communication • Behavior defined by UML state machines + AGSL • Logging of modeling experiences and guidelines [Diagram as on slide 5.]
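The statechart-style behavior described above can be sketched, very roughly, as a transition table keyed on (state, event) pairs. The object type, states, and events below are illustrative placeholders, not the demonstrator's actual model (which is expressed in UML state machines + AGSL):

```python
# Rough sketch of statechart-style behavior for one railway object type
# (a signal). States, events, and transitions are illustrative placeholders.
TRANSITIONS = {
    ("STOP", "clear_route"): "PROCEED",
    ("PROCEED", "occupy"):   "STOP",
    ("PROCEED", "failure"):  "FAILED",
    ("STOP", "failure"):     "FAILED",
}

class Signal:
    def __init__(self):
        self.state = "STOP"  # initial state

    def fire(self, event):
        """Take the transition for (state, event); undefined events are ignored."""
        self.state = TRANSITIONS.get((self.state, event), self.state)
        return self.state

sig = Signal()
print(sig.fire("clear_route"))  # PROCEED
print(sig.fire("occupy"))       # STOP
```

Slicing a complex model per object type, as the slide describes, amounts to keeping only the transitions relevant to one such table, which keeps test case generation tractable.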

  10. Overview of demonstration • Modeling of the Sim module • 3 models / more than 15 versions: • Similar to the Objs module • Separate models for each railway object type • Behavior defined by state machines + AGSL • Behavior completed by additional elements that allow complete Java source code generation [Diagram as on slide 5.]

  13. Summary of coverage statistics • The decoding functionality (conversion rules) was modeled and included in TCG • Additional requirements were not modeled • Code coverage (statement and branch) • Manual functional testing: approx. 30% • Test goals in the UML/UPPAAL track, generated to cover each conversion rule: about 50% • Transition coverage: about 60% • OOAS track: 0.5% more than the UPPAAL track • Simulator: 90% coverage of the generated code and 70% coverage of the external functions • Roughly twice the coverage of manual testing!
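The "twice the coverage" comparison above is straightforward percentage arithmetic over the set of executable statements hit. A trivial sketch (the actual figures come from the project's coverage tooling; the numbers below only mirror the 30% vs. 60% statement-coverage comparison):

```python
def statement_coverage(executed_lines, all_lines):
    """Statement coverage as a percentage of executable statements hit."""
    return 100.0 * len(executed_lines & all_lines) / len(all_lines)

all_stmts = set(range(100))  # 100 executable statements (illustrative)
manual = set(range(30))      # manual tests hit 30 of them
mbt = set(range(60))         # generated tests hit 60 of them

print(statement_coverage(manual, all_stmts))  # 30.0
print(statement_coverage(mbt, all_stmts))     # 60.0
```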

  14. Evaluation • Significantly more test cases, higher code coverage, more systematic testing • Precisely defined test oracle: solves the problem of involving independent testers • Trustworthiness of test cases: solves the problem of unmotivated testers • Short test maintenance cycles: solves the problem of long and inflexible cycles • Effort + efficiency, especially in the maintenance phase • Fundamental value: automated test oracle generation • Model-based development: test documentation • State machines: do not fit every requirement • Limitations of TCG: size and complexity of models

  15. Thank you for your attention!
