GUI Test Automation


Presentation Transcript


  1. GUI Test Automation Sharing the good, the bad, the ugly & What can we do about it?

  2. What are we going to cover today? • Why Consider Test Automation? • Test Automation -- an Investment • The Different Sides of Test Automation • Forces Affecting Test Automation • Three Case Studies • The “No-Record-Playback” Approach • An Iterative Approach to Implementation • Be Selective • The Deadliest Pitfalls • Review • Q&A

  3. Why Consider Test Automation? Our world today: • Frequent changes • Frequent builds • Frequent deliveries to customer • “To catch bugs early, we must test often” What Test Automation can bring: • High-speed cycle: code-test-analyze-fix-test… • Better testing, if designed right • Repetitive, reliable testing • Good coverage for data-oriented and/or repetitive verification tasks

  4. Test Automation -- an Investment 1/2 Cost: • Licenses • Training • Consulting • Building the library • Writing scripts • The main cost: MAINTENANCE (~95%) Benefits: • Catch bugs early, when they happen… get the “Duh” effect • Get bug ownership • Free QA resources for other tasks • Forces us to write down what we actually test • Can expand on it • Can avoid the biggest embarrassment

  5. Test Automation -- an Investment 2/2 Our experience: • App A - 210,000 lines of code • App B - 37,000 lines of code • App B again but a different test tool - ??? lines of code The easy traps: • Volume is quality… • Record now, clean up later • “Don’t build/run automated tests now because the GUI is changing” • Automated Testing is enough >>> All a bunch of fairy dust

  6. The Different Sides of Test Automation

  7. Forces Affecting Test Automation “We may not know the details of future changes, but we can guess where it is going to change” • Details of controls will change >> attach names, enabled, visible, list… • GUI behavior will change >> added messages, multilingual support, new forms… • Different OS, machine speed, network speed, file size • Requirement updates See handout for design strategies

  8. Case Study 1: “Record & Playback” • Demo 1 • Demonstrate brittleness • Demonstrate code duplication • Demonstrate maintainability issues

  9. Case Study 1: “Record & Playback” The reality: • AUT is always on the move • Every time you record, you duplicate code (buy now, pay later…) • Every time you duplicate code, you increase your maintenance cost • You’ll have to maintain it in hundreds of places • Pretty soon we will record again to add clicking on a message box… in hundreds of places

  10. “Record & Playback”

  11. Case Study 2: “Encapsulate Recorded Scripts” • Demo 2 • Two methods to limit duplicating code Code approach: • Think of GUI functionality • Can’t keep up with method names and arguments DB approach: • Move control-name details to a DB
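The "DB approach" above can be sketched with an in-memory SQLite table: physical control names live in one table, scripts refer only to logical names, and a developer rename becomes a one-row update instead of a suite-wide edit. The table layout and control names here are hypothetical:

```python
# Sketch of the "DB approach": control-name details live in one table.
# The schema and the control names are illustrative, not a real tool's API.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE controls (logical TEXT PRIMARY KEY, physical TEXT)")
db.executemany("INSERT INTO controls VALUES (?, ?)", [
    ("login_button", "MainWindow.LoginButton"),
    ("user_field",   "LoginDialog.UserField"),
    ("ok_button",    "LoginDialog.OkButton"),
])

def physical_name(logical):
    """Resolve a logical control name to the tool's physical name."""
    row = db.execute(
        "SELECT physical FROM controls WHERE logical = ?", (logical,)
    ).fetchone()
    if row is None:
        raise KeyError(f"unknown control: {logical}")
    return row[0]

# When developers rename a control, only the table changes -- not the scripts.
db.execute("UPDATE controls SET physical = 'LoginDialog.ConfirmButton' "
           "WHERE logical = 'ok_button'")
print(physical_name("ok_button"))  # LoginDialog.ConfirmButton
```

The same lookup also makes the "forces affecting test automation" slide concrete: the details most likely to change are isolated in one replaceable place.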

  12. “Encapsulate Recorded Scripts”

  13. Case Study 3: “Development Project” • Demo of an existing system • Demo writing a script

  14. “Development Project”

  15. “Development Project” Business Requirements: • Mimic the user’s actions • Provide a Scripter Façade • Encapsulate Attach functionality • Design for maintainability and scalability • Provide timely error reporting • Provide database-driven tests for data and behavioral verification • Provide a framework to extend the test tool’s functionality A few good coding practices: • Encapsulate • No code duplication

  16. “No-Record-Playback” Approach The golden rules: • Record only to understand how the test tool thinks and how it differentiates one control from another • Then DELETE IT • Never duplicate code • Choose your testing goals wisely The benefits: • Fully encapsulated code – one place for the Click code, one place for the actual control name… • Easy to upgrade and expand, if well designed
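A minimal sketch of this encapsulation, assuming a hypothetical driver and control names: the Click code lives in exactly one method, control names live in exactly one registry, and scripters call intention-revealing verbs (the Scripter Façade from the business requirements):

```python
# Sketch of "No-Record-Playback" encapsulation. The driver and all
# control names are hypothetical stand-ins for a real test tool.

CONTROLS = {"login": "MainWindow.LoginButton"}  # one place for names

class App:
    """Façade the scripters use: verbs, not raw controls."""
    def __init__(self, driver):
        self.driver = driver

    def _click(self, logical):
        # One place for the Click code: attach, wait, retry,
        # and logging would all be centralized here.
        self.driver.click(CONTROLS[logical])

    def login(self):
        self._click("login")

class RecordingDriver:
    """Stand-in for the real test tool's driver."""
    def __init__(self):
        self.clicked = []
    def click(self, name):
        self.clicked.append(name)

driver = RecordingDriver()
App(driver).login()
print(driver.clicked)  # ['MainWindow.LoginButton']
```

If the GUI changes, the fix lands in `CONTROLS` or `_click`, never in hundreds of scripts; the same shape also makes it cheap to swap test tools, since only the driver layer changes.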

  17. Iterative Approach to Implementation • List what needs to be tested >> test scenarios at a high level, using an organized thought process (risk, use cases, requirements…) • Earmark what could be done through test automation • Prioritize • Set some reasonable goals (like a Build Smoke Test) to validate the automation approach (tool, encapsulation techniques, library design…) • After the first script is running, reconsider your solution

  18. Be Selective in What to Automate 1/2 • Tasks repetitive to the point of boredom • Where we spend most of the time manually • Test scenarios that would catch issues as they happen • Unattended testing

  19. Be Selective… 2/2 • Automating a test takes 2 to 5 times as long as performing it by hand • Some tests are nearly impossible to do by hand • Expect to run a script 5 to 20 times before it becomes fully stable • Build script stability with daily builds

  20. The Deadliest Pitfalls • Using Record & Playback • Having no programming experience in the team • Treating it as a back-burner project • Ignoring the fact that you are building a framework • “The test passes because the script ran without errors” • Sloppy baselining practices

  21. Review for Success • Consider building this test automation framework as a development project (requirement, architecture…) with a development team (SME, Automation Engineer, and Tester) • Think reusable components • Pick a small initial goal like automating the smoke test, then a basic Regression System Test • Do not attempt to automate everything, think ROI • Manage timers and synchronization issues in an encapsulated way • Test your test tool
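Managing timers and synchronization "in an encapsulated way" usually means one polling helper with a timeout, instead of fixed sleeps scattered through every script. A minimal sketch, with illustrative timings and a simulated window:

```python
# Sketch: one encapsulated wait helper instead of fixed sleeps in scripts.
import time

def wait_until(condition, timeout=5.0, interval=0.05):
    """Poll `condition` until it returns True or the timeout expires."""
    deadline = time.monotonic() + timeout
    while True:
        if condition():
            return True
        if time.monotonic() >= deadline:
            return False
        time.sleep(interval)

# Example: wait for a simulated window that becomes "visible" after ~0.1 s.
start = time.monotonic()
window_visible = lambda: time.monotonic() - start > 0.1
print(wait_until(window_visible, timeout=2.0))  # True
```

Because the helper is one place, machine speed, network speed, and AUT timing changes (the "forces" slide) are absorbed by adjusting a timeout, not by editing every script.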

  22. Review • Think maintainability, as the AUT will change • Think reusable components (script / library) • Think scalability: use a library architecture to easily adapt to other applications being tested and other environments • Get developers involved from the get-go • Address synchronization problems early on • Balance the cool and the productive features of the test tool • Build a balanced team: SME, Automation Engineer, and Tester

  23. The End • More material: • Expected App changes and what to do about it? • Functional requirements built on experience • Web sites • Fit and FitNesse • …
