
Applying Model Based Testing in Different Contexts






  1. Applying Model Based Testing in Different Contexts Alexander Petrenko Victor Kuliamin ISP RAS, Moscow

  2. Introduction • MBT application success depends on the context of the project • Staff • Organization • Development culture, paradigms, and stereotypes • Can we modify the technical aspects of the MBT method to fit the context better? Yes! (at least sometimes…)

  3. Outline • Introduction • Examples • Software component testing – UniTesK • Formal language processor testing – OTK

  4. The Origin • 1994–1996: ISP RAS – Nortel Networks project on functional test suite development for the Switch Operating System kernel • A lot of bugs found in the OS kernel, which had been in use for 10 years • Several of them caused cold restarts • About 600K lines of code tested by 2000

  5. Proposed Solutions • Software contract specifications for functionality • Pre- and postconditions of operations • Invariants of data types • Test adequacy criteria based on coverage of specifications • Transformation of pre- and postconditions into an (IO)FSM, traversal of which guarantees the required coverage level • On-the-fly test generation – construction of a transition tour of the (IO)FSM or other paths
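The contract idea on this slide can be sketched as runnable code. This is a minimal illustration only: the class and method names below are invented for the example and are not the UniTesK/J@T API. Each operation checks its precondition on entry, its postcondition on exit, and the data type invariant after every call.

```java
// Illustrative contract-style checking in the spirit of the slide;
// not the actual UniTesK/J@T API.
import java.util.ArrayDeque;
import java.util.Deque;

public class ContractStack {
    private final Deque<Integer> items = new ArrayDeque<>();
    private final int capacity;

    public ContractStack(int capacity) { this.capacity = capacity; }

    private static void check(boolean cond, String msg) {
        if (!cond) throw new IllegalStateException(msg);
    }

    // Invariant of the data type: size never exceeds capacity.
    private void invariant() { check(items.size() <= capacity, "invariant"); }

    public void push(int x) {
        check(items.size() < capacity, "pre: stack full");    // precondition
        int oldSize = items.size();
        items.push(x);
        check(items.size() == oldSize + 1 && items.peek() == x,
              "post: push");                                  // postcondition
        invariant();
    }

    public int pop() {
        check(!items.isEmpty(), "pre: stack empty");          // precondition
        int oldSize = items.size();
        int top = items.pop();
        check(items.size() == oldSize - 1, "post: pop");      // postcondition
        invariant();
        return top;
    }

    public static void main(String[] args) {
        ContractStack s = new ContractStack(2);
        s.push(1);
        s.push(2);
        System.out.println(s.pop()); // prints 2
        System.out.println(s.pop()); // prints 1
    }
}
```

A coverage-driven test engine would call such operations along a path through the FSM derived from the contracts; here a plain main exercises them directly.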

  6. Ch@se Tool

  7. The Whole Picture [architecture diagram: the testing model combines a behavior model and a coverage model; test input generation and single-input checking connect the model to the system under test]

  8. Tools • J@T (with C++ Link): Java (C++) / NetBeans, Eclipse • CTesK: C / Visual Studio 6.0, gcc • Ch@se: C# / Visual Studio .NET 7.1

  9. Applications • IPv6 implementations (2001–2003) • Microsoft Research • Mobile IPv6 (in Windows CE 4.1) • Oktet • Web-based banking client management system • Enterprise application development framework • Billing system • Components of TinyOS • http://www.unitesk.com

  10. Application Success Prerequisites • Knowledge source on actual functionality • Documents • Domain experts • Developers, architects • Direct access to the interface under test [diagram: the Tester has direct access to the SUT]

  11. Outline • Introduction • Examples • Software component testing – UniTesK • Formal language processor testing – OTK

  12. The Origin • 2001–2003: ISP RAS – Intel project on testing the correctness of a set of optimizers • Several dozen bugs found

  13. SUT Restrictions • No access to actual functionality – only general descriptions of the processing algorithms • No access to the unit interface – only the external compiler interface [diagram: the Tester reaches the unit only through the compiler's external interface]

  14. Proposed Solutions • Program structure (abstracted from irrelevant elements) considered as the test data model • Structured set of data generators • Checking optimizer correctness by comparison with the nonoptimized program • Coverage goals • The processing algorithm is taken into account by the model • Heuristics
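The structured-generator idea can be sketched as follows. This is an illustrative toy, not the OTK tool's API: it enumerates all expression trees of a fixed depth over a tiny grammar of variables, constants, and binary operators, in the spirit of the Simple Expressions example.

```java
// Toy structured test-data generator (illustrative only, not OTK):
// enumerate every expression tree of a given depth over a small grammar.
import java.util.ArrayList;
import java.util.List;

public class ExprGenerator {
    static final String[] OPS = {"+", "-", "*", "/"};

    // Generate every expression of exactly the given depth.
    static List<String> gen(int depth) {
        List<String> result = new ArrayList<>();
        if (depth == 0) {
            result.add("x");    // variable leaf
            result.add("17");   // constant leaf
            return result;
        }
        // Combination iterator: all (left, op, right) triples.
        for (String left : gen(depth - 1))
            for (String op : OPS)
                for (String right : gen(depth - 1))
                    result.add("(" + left + " " + op + " " + right + ")");
        return result;
    }

    public static void main(String[] args) {
        List<String> tests = gen(1);
        System.out.println(tests.size());  // 2 leaves * 4 ops * 2 leaves = 16
        System.out.println(tests.get(0));  // prints (x + x)
    }
}
```

Exhaustive enumeration up to a small depth gives systematic structural coverage of the grammar; real generators would add the coverage goals and heuristics the slide mentions to prune the combinatorial explosion.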

  15. Example: Simple Expressions Test [class diagram: a Test contains 1..* Statements of the form xi = Expr; each Statement has one Variable (id : string) and one Expression; an Expression is a Variable, a Constant (value : int, e.g. 17), or a BinaryExpr (op : {+,-,*,/}) with left and right subexpressions, e.g. Expr1 + Expr2]

  16. OTK Tool

  17. Test Data Generation [diagram: the processing algorithm is decomposed into basic blocks and connectors; a combination iterator enumerates their combinations, and a mapping to language basic blocks (e.g. if(…)…else …, for(…;…;…)) turns each combination into test data, i.e. programs]

  18. Optimizer Correctness Checking [diagram: co-testing compiles the test program twice, with and without the optimizer, and checks that the two results are equal]
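The co-testing oracle can be sketched like this. This is an assumed scheme for illustration: a toy hand-"optimized" function stands in for the compiler's optimizer, and the check is that the unoptimized and optimized paths agree on every input.

```java
// Assumed co-testing scheme: run the same computation through an
// unoptimized and an "optimized" path; any difference flags an optimizer
// bug. The toy transformations below (strength reduction, common-
// subexpression reuse) stand in for a real compiler optimizer.
public class CoTest {
    // Reference semantics: straight-line evaluation of x*2 + x*2.
    static int unoptimized(int x) {
        return x * 2 + x * 2;
    }

    // Hand-"optimized" version of the same program.
    static int optimized(int x) {
        int t = x << 1;   // strength reduction: x*2 becomes a shift
        return t + t;     // common subexpression computed once, reused
    }

    public static void main(String[] args) {
        for (int x = -1000; x <= 1000; x++) {
            if (unoptimized(x) != optimized(x))
                throw new AssertionError("optimizer bug at x = " + x);
        }
        System.out.println("co-test passed"); // prints co-test passed
    }
}
```

The strength of this oracle is that it needs no access to the optimizer's internals: only the externally observable results of the two compilations are compared, which matches the SUT restrictions listed earlier.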

  19. Further Work • Testing the whole compiler • Syntax checker • Static semantics checker • Back-end • Formal semantics description needed • Tools integration (in progress)

  20. References • V. Kuliamin, A. Petrenko, I. Bourdonov, and A. Kossatchev. UniTesK Test Suite Architecture. Proc. of FME 2002, LNCS 2391, pp. 77-88, Springer-Verlag, 2002. • V. Kuliamin, A. Petrenko, N. Pakoulin, I. Bourdonov, and A. Kossatchev. Integration of Functional and Timed Testing of Real-time and Concurrent Systems. Proc. of PSI 2003, LNCS 2890, pp. 450-461, Springer-Verlag, 2003. • A. Kossatchev, A. Petrenko, S. Zelenov, and S. Zelenova. Using Model-Based Approach for Automated Testing of Optimizing Compilers. Proc. Intl. Workshop on Program Understanding, Gorno-Altaisk, 2003. • V. Kuliamin, A. Petrenko, A. Kossatchev, and I. Burdonov. The UniTesK Approach to Designing Test Suites. Programming and Computer Software, Vol. 29, No. 6, 2003, pp. 310-322. (Translated from Russian) • S. Zelenov, S. Zelenova, A. Kossatchev, and A. Petrenko. Test Generation for Compilers and Other Formal Text Processors. Programming and Computer Software, Vol. 29, No. 2, 2003, pp. 104-111. (Translated from Russian)

  21. Contacts Alexander K. Petrenko Victor V. Kuliamin petrenko@ispras.ru kuliamin@ispras.ru 109004, B. Kommunisticheskaya, 25 Moscow, Russia Web: http://www.ispras.ru/groups/rv/rv.html Phone: +7-095-9125317 Fax: +7-095-9121524
