A lightweight framework for testing database applications

Presentation Transcript


  1. A lightweight framework for testing database applications
     Joe Tang, Eric Lo
     Hong Kong Polytechnic University

  2. Our focus
     • System testing (or black-box testing)
     • A database application's correctness/behavior depends on:
       • the application code, plus
       • the information in the database

  3. How to test a database application?
     • Test preparation:
       • In a particular "correct" release, a tester 'plays' the system, and the sequence of actions (e.g., clicks) is recorded as a test trace/case T
         • E.g., T1: a user queries all products
         • E.g., T2: a user adds a product
       • The output of the system is recorded as the "expected results" of that trace
       • For database applications, "the output of the system" depends on the database content
       • A test trace may modify the database content
       • For ease of managing multiple test traces, we reset the database content at the beginning of recording each test trace
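To make the preparation step concrete, here is a minimal Python sketch of trace recording. The `app` object and its `reset_database`/`execute` methods are hypothetical stand-ins for the application under test; they are not part of the paper's framework.

```python
from dataclasses import dataclass, field

@dataclass
class TestTrace:
    name: str
    actions: list = field(default_factory=list)   # recorded user actions (e.g., clicks)
    expected: list = field(default_factory=list)  # expected output, one entry per action

def record_trace(app, name, actions):
    """Replay `actions` against a freshly reset database and record
    each action's output as the expected result for this trace."""
    app.reset_database()  # reset before recording, as on slide 3
    trace = TestTrace(name)
    for action in actions:
        trace.actions.append(action)
        trace.expected.append(app.execute(action))
    return trace
```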

  4. How to test a database application?
     • Test execution: for each test trace T
       • Reset the database content
       • Run the sequence of actions (e.g., clicks) recorded in T
       • Match the system output with the expected output
     • Problem: resetting the DB content is expensive
       • Involves content recovery + log cleaning + thread resetting [ICSE04]
       • About 1-2 minutes for each reset
       • With 1000 test traces, that is up to 2000 minutes (33 hours)
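The baseline loop, continuing the hypothetical `app`/`TestTrace` sketch from above; the one-reset-per-trace structure is exactly the cost the rest of the deck tries to reduce.

```python
def run_suite_naive(app, traces):
    """Baseline execution: one (expensive) database reset per trace."""
    failures = []
    for trace in traces:
        app.reset_database()  # the costly step: roughly 1-2 minutes each
        for action, expected in zip(trace.actions, trace.expected):
            if app.execute(action) != expected:
                failures.append(trace.name)
                break
    return failures
```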

  5. DB application testing optimization
     1. Test automation: execute the test traces (and DB resets) automatically (vs. manually, one by one)
     2. Test execution strategies
     3. Test optimization algorithms
     Items 2 + 3 aim to minimize the number of DB resets

  6. Related work
     2. Test execution strategies
        • OPTIMISTIC [vldbj]: execute resets lazily
        • E.g., execution order: T1 T2 T3 R T3 (reset R only when T3's output mismatches, then rerun T3)
     3. Test optimization algorithms
        • SLICE algorithm [vldbj]: if T1 T2 R this time, next time we try T2 T1 ...
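A sketch of the lazy-reset idea under the same hypothetical `app` interface: traces run back to back, a reset is paid only when a trace's output mismatches, and the trace is then rerun once to separate stale database state from a real failure.

```python
def run_trace(app, trace):
    """Replay a trace; True iff every output matches its expectation."""
    return all(app.execute(a) == e
               for a, e in zip(trace.actions, trace.expected))

def run_suite_optimistic(app, traces):
    """OPTIMISTIC sketch: defer resets (the 'T1 T2 T3 R T3' pattern).
    Only a mismatch that survives a reset-and-rerun is reported."""
    failures = []
    for trace in traces:
        if not run_trace(app, trace):
            app.reset_database()
            if not run_trace(app, trace):
                failures.append(trace.name)
    return failures
```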

  7. Problems
     2. Test execution strategies
        • OPTIMISTIC [vldbj]: execute resets lazily
        • E.g., execution order: T1 T2 T3 R T3
        • May introduce false positives
          • E.g., T2 covers a bug but it says nothing!
     3. Test optimization algorithms
        • SLICE algorithm [vldbj]: if T1 T2 R this time, next time we try T2 T1 ...

  8. Problems
     2. Test execution strategies
        • OPTIMISTIC [vldbj]: execute resets lazily
        • E.g., execution order: T1 T2 T3 R T3
        • May introduce false positives
          • E.g., T2 covers a bug but it says nothing!
     3. Test optimization algorithms
        • SLICE algorithm [vldbj]: if T1 T2 R this time, next time we try T2 T1 ...
        • Large overhead: keeps swapping ordering information
        • Gets worse when test traces are added or removed

  9. This paper
     • Test execution strategy: SAFE-OPTIMISTIC
       • No false positives
     • Test optimization algorithm: SLICE*
       • No overhead
       • Comparable performance to SLICE
       • Better than SLICE when test traces are added or removed

  10. Test execution strategy: SAFE-OPTIMISTIC
      • Also "executes resets lazily"
      • Test preparation: record not only the system output, but also the query results
      • Test execution: match not only the system output, but also the query results
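A sketch of the extra bookkeeping on the same hypothetical `app` object. The `last_query_results()` hook, which returns the results of the queries the last action issued, is an assumption made for illustration; the slides do not say how the framework captures query results.

```python
def record_trace_safe(app, name, actions):
    """SAFE-OPTIMISTIC preparation: per action, store the system
    output together with the results of the queries it issued."""
    app.reset_database()
    trace = TestTrace(name)
    for action in actions:
        output = app.execute(action)
        trace.actions.append(action)
        trace.expected.append((output, app.last_query_results()))
    return trace

def run_trace_safe(app, trace):
    """SAFE-OPTIMISTIC execution: a trace passes only if both the
    output and the query results match, so a bug whose visible
    output happens to look right is still caught."""
    for action, (out, queries) in zip(trace.actions, trace.expected):
        if app.execute(action) != out or app.last_query_results() != queries:
            return False
    return True
```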

  11. Implementation

  12. Test optimization algorithm: SLICE*
      • Maintains a collection of "slices"
      • If the execution order was T1 T2 T3 R T3 T4 T5,
        then we know <T1 T2> and <T3 T4 T5> are good (each ran without a reset)
      • Next time: swap the slices, and thus try T3 T4 T5 T1 T2
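A sketch of the slice bookkeeping (the helper names are hypothetical; the slides do not spell out SLICE*'s data structures): the executed order is cut into reset-free slices, and whole slices are swapped for the next run instead of re-learning pairwise orderings.

```python
def slices_from_run(executed, reset_before):
    """Cut an executed order into slices at the reset points; each
    slice ran without a reset, so it is known-good as a unit."""
    slices, start = [], 0
    for pos in sorted(reset_before):
        slices.append(executed[start:pos])
        start = pos
    slices.append(executed[start:])
    return [s for s in slices if s]

def next_order(slices):
    """Swap whole slices for the next run (here: rotate by one), so a
    different slice follows the implicit reset at the start."""
    rotated = slices[1:] + slices[:1]
    return [t for s in rotated for t in s]

# The deck's example: run order T1..T5 with a reset needed before T3.
slices = slices_from_run(["T1", "T2", "T3", "T4", "T5"], reset_before=[2])
print(slices)              # [['T1', 'T2'], ['T3', 'T4', 'T5']]
print(next_order(slices))  # ['T3', 'T4', 'T5', 'T1', 'T2']
```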

  13. Evaluation
      • A real-world case study
        • An on-line procurement system
        • Test database: 1.5 GB
        • A database reset takes 1.9 min
      • Synthetic experiments
        • Vary the number of test cases
        • Vary the degree of "conflicts" between test cases
        • Vary the % of updates in the test suite

  14. Real case study

  15. 1000 test traces, 100K conflicts

  16. Conclusion
      • SLICE* and SAFE-OPTIMISTIC run tests on database applications
        • efficiently,
        • safely (no false positives),
        • and can handle test suite updates

  17. References
      • [vldbj] Florian Haftmann, Donald Kossmann, Eric Lo: A framework for efficient regression tests on database applications. VLDB J. 16(1): 145-164 (2007)
      • [ICSE04] R. Chatterjee, G. Arun, S. Agarwal, B. Speckhard, and R. Vasudevan: Using data versioning in database application development. ICSE 2004
