
Deriving State-Based Test Oracles for Conformance Testing


Presentation Transcript


  1. Deriving State-Based Test Oracles for Conformance Testing Jamie Andrews Associate Professor Department of Computer Science University of Western Ontario London, Ontario

  2. Plan for Talk • Testing background • Log file analysis (LFA) • Process for developing artifacts • Refinement of process

  3. Testing Background • 3 main tasks in testing: • Selecting test inputs • Running tests • Checking test outputs • Checking test output is not always trivial [Diagram: Test Input → Software Under Test → Test Output]

  4. Checking Test Output • May be too complex to check visually • May be legitimately different from output of previous version [Diagram: Test Input → Software Under Test → Test Output]

  5. Test Oracles • Programs that check the output of other programs [Diagram: Test Input → Software Under Test → Test Output → Test Oracle → Pass/Fail]
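As a toy illustration (my own example, not from the talk), an oracle for a sorting routine can simply check that the reported output is the sorted input and emit Pass or Fail:

    # Toy test oracle sketch (illustrative only): checks a sorting
    # program's output against its input.
    def sort_oracle(test_input, test_output):
        return "Pass" if test_output == sorted(test_input) else "Fail"

    print(sort_oracle([3, 1, 2], [1, 2, 3]))   # Pass
    print(sort_oracle([3, 1, 2], [1, 3, 2]))   # Fail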

  6. Test Oracles • Input and output may be difficult to capture • Oracle may have to parse complex I/O [Diagram: Test Input → Software Under Test → Test Output → Test Oracle → Pass/Fail]

  7. Log File Analysis (LFA) • Log file: simple text output • LFA: dynamic analysis for conformance checking [Diagram: Test Input → Software Under Test → Test Output and Log File; Log File → Log File Analyzer → Pass/Fail]

  8. LFA Challenges • Make sure logging doesn’t slow down system too much • Can send logging data to another machine • Decide on logging policy for software • Write log file analyzer program • Special-purpose state machine-based language to help with this

  9. Log File Analysis Language (LFAL) • Analyzer = collection of state machines • Each machine notices some log file lines, ignores others • Log file lines trigger transitions • Machine reports error if it: • Notices a line • Does not have a legal transition on it
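A minimal sketch of these semantics, written in Python rather than LFAL (the names StateMachine, observe, and analyze are my own, not part of the LFAL toolkit): each machine notices only the events its transitions mention, ignores everything else, and flags an error when a noticed event has no legal transition from the current state.

    # Illustrative Python sketch of the per-machine semantics above;
    # not the actual LFAL implementation.
    class StateMachine:
        def __init__(self, name, initial_state, transitions):
            # transitions: dict mapping (state, event) -> next state
            self.name = name
            self.state = initial_state
            self.transitions = transitions
            # The machine "notices" exactly the events named in its transitions.
            self.noticed = {event for (_, event) in transitions}

        def observe(self, event):
            # Process one log-file line; return an error message or None.
            if event not in self.noticed:
                return None                         # ignored line
            key = (self.state, event)
            if key in self.transitions:
                self.state = self.transitions[key]  # legal transition
                return None
            # Noticed line with no legal transition from the current state.
            return "%s: illegal event '%s' in state '%s'" % (self.name, event, self.state)

    def analyze(machines, log_lines):
        # Run every machine over the whole log; collect conformance errors.
        errors = []
        for line in log_lines:
            event = line.strip()
            for m in machines:
                err = m.observe(event)
                if err:
                    errors.append(err)
        return errors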

  10. Processes • Need a process for getting • From requirements • To logging instrumentation, analyzer • “Big-step” process: • Used and taught to students • “Small-step” process: • Suggested refinement

  11. Example: Elevator System • Requirement to check: • “The doors are never open when the elevator is in motion.”

  12. Big-Step Process [Diagram: Requirements → Situations with Permitted and Forbidden Events (SPFEs) → Logging Policy and Log File Analyzer Program]

  13. Example: SPFEs • SPFE1: • Situation: Elevator door is open • Permitted event: Door closes • Forbidden event: Elevator starts moving • SPFE2: • Situation: Elevator is moving • Permitted event: Elevator stops moving • Forbidden event: Door opens

  14. Relationships • SPFEs should re-express requirements to be checked • Logging policy should specify that we log all events that allow us to determine: • Whether we are in each Situation • Whether each Permitted/Forbidden Event has happened

  15. Example: Logging Policy • Log all door open / close events in the form door_open, door_close • Log all elevator move / stop events in the form start_move, stop • Can instrument code based on this
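A sketch of how code instrumentation for this policy might look; the controller functions and actuator calls below are invented for illustration, and only the logged event names come from the slide.

    import logging

    log = logging.getLogger("elevator")

    def open_door(door):
        door.release_lock()         # hypothetical actuator call
        log.info("door_open")       # event name required by the logging policy

    def start_motion(motor, target_floor):
        motor.run_to(target_floor)  # hypothetical motor call
        log.info("start_move")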

  16. From SPFEs to Analyzer • Situations correspond to states • Permitted events correspond to transitions • Forbidden events should not have any corresponding transition

  17. Example: LFAL Analyzer [State diagram: closed_stopped --start_move--> moving, moving --stop--> closed_stopped, closed_stopped --door_open--> open, open --door_close--> closed_stopped]
    machine door_safety;
    initial_state closed_stopped;
    from closed_stopped, on start_move, to moving;
    from moving, on stop, to closed_stopped;
    from closed_stopped, on door_open, to open;
    from open, on door_close, to closed_stopped;
    final_state Any.
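For comparison, the same door_safety machine expressed with the illustrative Python sketch given after slide 9 (again a sketch, not the real LFAL toolchain), together with a log that violates the requirement:

    door_safety = StateMachine(
        name="door_safety",
        initial_state="closed_stopped",
        transitions={
            ("closed_stopped", "start_move"): "moving",
            ("moving", "stop"): "closed_stopped",
            ("closed_stopped", "door_open"): "open",
            ("open", "door_close"): "closed_stopped",
        },
    )

    # A log in which the door opens while the elevator is moving fails:
    print(analyze([door_safety], ["start_move", "door_open", "stop"]))
    # -> ["door_safety: illegal event 'door_open' in state 'moving'"]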

  18. Some Past Projects • Steam boiler simulator and analyzer (X. An) • WAP client development and testing (V. Liu) • 5 KLOC; 3 protocol layers verified • Test case generation from oracles (R. Fu)

  19. Problems with Big-Step Process • Not always explicit: • Which requirements are to be checked • Under what assumptions/conditions • Sometimes “abstract” events mentioned in SPFEs cannot be directly logged • e.g. “door open” event may actually correspond to “send release command to door lock actuator” • Need more concrete, loggable events

  20. Small-Step Process [Diagram: Requirements to be Checked and Checking Assumptions → SPFEs → Abstract Events → Concrete Events → Logging Policy and Log File Analyzer Program]

  21. Future Work • Case studies of small-step process • Teaching small-step process • Experiments

  22. Potential Benefits and Problems • Benefits: improved reliability, flexibility, scalability, traceability • Problems: • False negatives and positives • Instrumentation maintenance • Process weight

  23. Summary • LFA is a method for test result checking • We have used and taught a process for applying it • We propose a refined process for future work

  24. Frequently Asked Questions • Do we have to discard regression testing? • No; LFA can be used as a complement • How do we know what to put in the log file? • Recommend identifying requirements to check with LFA first • Develop the logging policy and log file analyzer from those • Efficiency? Scalability? • A few years ago processed 1000 lines/sec • Biggest analyzer 1200 NLOC, from a 10-page spec • Recommend starting small and seeing if it works for you
