
Software Test Automation Texas Instruments Software Development Organization


Presentation Transcript


  1. Software Test Automation. Texas Instruments Software Development Organization. September 2009

  2. Agenda • Goals of Common SDO test automation effort. • System Architecture / Layers • Test Setup • Test Plans • Roadmap

  3. 1) Goals

  4. Goal of SDO Automation Effort
• To establish a common integration and test infrastructure for development and test teams to use across SDO that leverages open source components.
• To enable customers, partners, and the community to run regression tests and contribute tests.

  5. Goals and Solutions
• Goal: Share the same test suites/test cases across sites
  • Solution:
    • Use an open source test management system (TestLink) to store test suites.
    • Use a single test database at all sites.
    • Automate the test execution of most test suites/test cases.
• Goal: Open architecture to support multiple execution engines
  • Solution:
    • Design a system that allows using different types of execution engines to suit user needs.
    • The execution engine type is part of the test case definition.
• Goal: Share hardware resources
  • Solution:
    • Implement a common Resource Manager that allocates resources as needed.
• Goal: Enable more people to run tests
  • Solution:
    • Implement a web-based test management system that allows people to trigger common tests and get results at local or remote locations.
• Goal: Facilitate Continuous Integration & Test
  • Solution:
    • Incorporate Build Execution Engines (BEE) into the test automation architecture so a Continuous Integration system can be implemented.
    • Code can be built and tested by developers and testers as often as desired and at any point in the development cycle.
• Goal: Maximize Open Source Software usage
  • Solution:
    • Leverage existing open source tools as much as possible.
    • The new test architecture uses TestLink and STAF.

  6. 2) System Architecture

  7. High-Level Test Architecture
[Diagram: at each site (Site A, Site B) a Test Management System with its Test Artifacts database (MySQL) feeds a Test Master Controller, which drives Service Providers (Test Execution Engines and Build Execution Engines) over an IP network.]
• Test Management System: create and manage Test Cases & organize them into Test Plans; request Test Job execution.
• Test Master Controller: manage resources; dispatch & monitor jobs.
• Service Providers: execute build & test jobs.
• Test Artifacts databases must be synchronized on a periodic basis.
• The 3rd layer is the service-providers layer. TEE and BEE are just special Service Providers.
• It should be possible to reuse Service Providers across sites if/when desired.
• Typically 1 TMS : 1 TMC, and 1 TMC : many Service Providers.

  8. Test Architecture: 3 Functional Layers. 1) Test Management System. 2) Test Master Controller. 3) Service Providers
[Diagram: test data flows from the TMS (as XML) to the TMC, which adapts it for the TEE and BEE service providers.]
• Test Management System: TestLink (GPL) + STAF, http://www.teamst.org/. Holds Test Requirements, Test Projects, Test Cases, Test Plans and Test Reports; web front-end; TI custom fields* (* custom fields = Test Execution Engine, Test Execution Logic, Test Parameters, HW Assets, SW Assets).
• Test Master Controller: STAF (EPL), http://staf.sourceforge.net/. Provides the Dispatcher Service, Monitor Service and Resource Management Service; adapts the data for the TEE and BEE.
• Service Providers: Build Execution Engine and Test Execution Engine. STAF + VATF | STAX | iCOTS | TestRunner | other; execution logic and equipment drivers.

  9. Test Management Layer: TestLink
TestLink is an open source test management tool:
• It uses a MySQL database to hold test cases / test suites / test plans.
• It runs on a server and requires no proprietary tools to run.
• It uses a browser as the user interface.
• It does not natively have good support for automated test cases, but we are enhancing its automation support by integrating it with STAF.
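As a hedged illustration of what "automation support" can look like on the TestLink side, here is a minimal sketch of reporting an automated result through TestLink's XML-RPC API (using Python 3's xmlrpc.client; in 2009 the equivalent module was xmlrpclib). The server URL, dev key, and test case/plan/build identifiers are placeholders, not values from the TI setup described in these slides.

```python
# Minimal sketch: pushing an automated result into TestLink over its XML-RPC API.
# Assumes a TestLink instance with the API enabled; the URL, dev key, and IDs
# below are illustrative placeholders only.
import xmlrpc.client

TESTLINK_URL = "http://testlink.example.com/lib/api/xmlrpc.php"  # hypothetical server
DEV_KEY = "1234abcd"                                              # per-user API key

server = xmlrpc.client.ServerProxy(TESTLINK_URL)

result = server.tl.reportTCResult({
    "devKey": DEV_KEY,
    "testcaseexternalid": "SDO-42",     # e.g. an L1 sanity test case
    "testplanid": 10,                   # TestLink internal test plan id
    "buildname": "nightly-2009-09-15",  # illustrative build tag
    "status": "p",                      # 'p' = pass, 'f' = fail, 'b' = blocked
    "notes": "Reported automatically by the TEE via STAF",
})
print(result)
```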

  10. Test Master Controller Layer: STAF
STAF is an open source, multi-platform, multi-language framework designed around the idea of reusable components.
• It allows STAF processes on different machines to communicate & initiate processes on each other.
We will use it for:
• Resource management
• Dispatching test requests for execution
• Tracking test execution
• Communication between layers of the system
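To make the "communication between layers" concrete, here is a minimal sketch using STAF's Python bindings (PySTAF, shipped with STAF 3.x). The handle name, remote host, and shell command are illustrative placeholders, not part of the TI setup.

```python
# Minimal sketch: submitting STAF service requests from Python.
# Assumes STAF 3.x with PySTAF installed and STAFProc running on each machine;
# the handle name and the remote host are illustrative only.
from PySTAF import STAFHandle, STAFException

try:
    handle = STAFHandle("sdo-automation-demo")
except STAFException as e:
    raise SystemExit("Error registering with STAF, RC: %d" % e.rc)

# Simple liveness check against the local STAFProc.
result = handle.submit("local", "PING", "PING")
print("ping rc=%d result=%s" % (result.rc, result.result))

# Ask a (hypothetical) remote test machine to run a command via the PROCESS service.
result = handle.submit(
    "tee-host.example.com", "PROCESS",
    'START SHELL COMMAND "uname -a" WAIT RETURNSTDOUT STDERRTOSTDOUT')
print("process rc=%d" % result.rc)

handle.unregister()
```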

  11. Service Providers: TEE / BEE Layer
The service providers layer creates builds and runs tests.
Initial implementation:
• Test Engine: VATF (home-grown Test Execution Engine)
• Build Engine: File BEE (provides a pre-built image); Arago BEE (builds the image using the Arago/OE infrastructure)
A STAF client will run on the test execution systems to interface the TEE and BEE components to the TMC layer, and to perform resource registration / status reporting.

  12. 3) Test Setups

  13. High-Level Test Setup

  14. TEE Test Setup Details (USB)

  15. TEE Test Setup Details (OVQ)

  16. 4) Test Plans

  17. Test Plans
• TI will offer Test Plans for individual components (e.g. PSP) and for integrated systems/solutions (e.g. DVSDK).
• TI will provide Test Plans for different test scopes:
  • L1: Level 1 sanity (a.k.a. smoke) tests focus on verifying that a feature or component is operational. Each L1 test should take at most a few minutes to complete, and the whole L1 test plan should not take more than 12 hours to run. The L1 test plan should be run as frequently as possible.
  • L2: Level 2 tests focus on complete functional validation. The L2 test plan may take a few days to run and may involve manual testing.
  • L3: Level 3 tests focus on measuring performance, stressing the system, and exercising system use case scenarios. The L3 test plan may take a few weeks to run and most likely requires some manual testing.

  18. Test Plans cont.
• TI has test plans for many areas, as shown in the next two slides; however, they are not yet in TestLink format. TI is currently in the process of migrating existing test plans to TestLink.
• After migrating the test plans to TestLink, we will benefit from features of the new system such as reuse of the same test logic at different locations, hardware resource management, and continuous testing of software built using Arago.
• Existing automated test plans focus primarily on L1 and L2 testing. TI is working on adding more L3 tests as well as incorporating new open source test tools into the automated test plans.

  19. PSP Test Areas
• ATA
• Audio: ALSA, OSS
• CIR
• EDMA
• Ethernet
• GPIO
• I2C
• Keyscan
• MMC: SD, SDHC, SDIO
• NAND
• NOR
• RTC
• SPI
• Timer
• UART
• U-Boot
• USB Host: MSC, ISO, HID
• USB Slave: MSC, CDC/RNDIS
• Video: Fbdev Display
• Video: V4L2 Display
• Video: V4L2 Capture
• Video: H3A
• Video: Previewer/Resizer
• Video: Face Detection
• WDT

  20. DVSDK Test Areas
• Applications: Demos, DVTB, GStreamer
• Codecs: H.264 encoder, H.264 decoder, MPEG4 encoder, MPEG4 decoder, JPEG encoder, JPEG decoder, AAC encoder, AAC decoder, G711 encoder, G711 decoder, G729 encoder, G729 decoder, MP3 decoder

  21. Test Equipment & Drivers
• Drivers for the following hardware have already been written and validated:
  • Generic Linux host driver (DM355, DM357, DM365, DM644x, DM6467, etc.)
  • APC Power Controller
  • Extron Video/Audio Switches
  • Extron USB Switches
  • Tascam DVD players
  • Pioneer DVD players
  • Q-Master's Video Quality System
  • Video Clarity's Video Quality System
  • Generic PC Sound Card
  • Spectra Lab's Audio Quality System
  • Opera's PESQ Speech Quality System
• There are also some second-tier drivers that reuse the HW drivers and add extra functionality:
  • DVSDK DVTB for multiple DVSDK releases
  • DVSDK Demos for multiple DVSDK releases
  • Windows & Linux USB Hosts

  22. 5) Roadmap

  23. Roadmap
• Beta: December 2009
  • Internal and selected external users are able to set up the system and run existing Test Plans using ‘high-end’ test equipment.
  • Open source Git project available.
  • Contributions limited to internal and possibly a few external users.
  • Installation Guide available.
  • User's Guide available.
• GA: March 2010
  • Users are able to develop new test plans on their own and contribute to the open source project.
  • Contributions open to all.
  • Developer's Guide available.
  • Developer how-to videos available.
  • Support for less-expensive test equipment, so it is easier for anybody to run tests on their setup.

  24. Backup Slides

  25. Steps to create automated Test Plan
• Select Test Equipment to use (if any)
• Select BEE & TEE to use
• Define Test Parameters
• Write Test Equipment driver (if one doesn’t exist)
• Write Test Script for selected TEE
• Add new Test Cases/Test Suite to the TestLink project. Each test case identifies:
  • Required Software Assets
  • Required Hardware Assets
  • Test Parameters (DUT, Test Equipment, Application)
  • Test Execution Engine type (e.g. VATF, STAX)
  • Test Script
An easy way to remember the information that goes into the test case definitions is the word SHAPES (one way those fields might be grouped is sketched below).
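To make the SHAPES mnemonic concrete, here is a minimal sketch of how one such test case definition might be grouped as data. The field names, values, and the exact mapping of the letters are illustrative guesses; the real definitions live in TestLink custom fields (see slide 8), not in this structure.

```python
# Illustrative grouping of the SHAPES fields for one automated test case.
# All names and values are hypothetical placeholders.
test_case = {
    "name": "L1_ethernet_ping",
    "software_assets": ["psp-linux-kernel", "u-boot"],         # S: SW assets to build/load
    "hardware_assets": ["dm365-evm", "apc-power-controller"],  # H: HW assets to reserve
    "application": "ping",                                     # A: application exercised
    "parameters": {"dut": "dm365-evm", "iterations": 100},     # P: test parameters
    "execution_engine": "VATF",                                # E: TEE type (e.g. VATF, STAX)
    "script": "ethernet_ping_test",                            # S: test script for that TEE
}
```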

  26. TMC’s Resource Manager
[Diagram: a TMS/user request flows into the TMC's STAFProc, which hosts the DISPATCHER and RESMGR services and drives the TEE; the numbers in the diagram correspond to the steps below.]
1. The TEE registers with RESMGR and provides its type and capabilities upon startup. It also registers with the DISPATCHER and provides an xslt or exec file to transform the test data.
2. A test request is sent to the TMC.
3. The DISPATCHER queries RESMGR for an available TEE of the required type with the required HW_ASSETS (capabilities).
4. RESMGR provides the TEE machine name.
5. The DISPATCHER sends the test request to the TEE machine obtained in step 4.
6. The TEE completes job execution and notifies the DISPATCHER.
7. The DISPATCHER releases the TEE.
8. The TEE unregisters with RESMGR upon shutdown.
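For illustration only, a minimal sketch of how the DISPATCHER side of steps 3 to 7 might look on top of STAF's Python bindings. The RESMGR request strings and the test command are invented placeholders, not the actual syntax of TI's Resource Manager service; only the PySTAF handle/submit usage reflects real STAF APIs.

```python
# Hypothetical sketch of steps 3-7 above, using STAF's Python bindings (PySTAF).
# "RESMGR" request syntax and the TEE job command are invented placeholders.
from PySTAF import STAFHandle

handle = STAFHandle("tmc-dispatcher-demo")

# Step 3: ask the (hypothetical) RESMGR service for an available TEE with the
# hardware assets this test case requires.
res = handle.submit("local", "RESMGR", "REQUEST TYPE VATF HW_ASSETS dm365-evm")
if res.rc != 0:
    raise SystemExit("No suitable TEE available, rc=%d" % res.rc)

tee_machine = res.result  # Step 4: RESMGR returns the TEE machine name

# Step 5: dispatch the job to that TEE machine via STAF's PROCESS service.
job = handle.submit(
    tee_machine, "PROCESS",
    'START SHELL COMMAND "run_test ethernet_ping" WAIT RETURNSTDOUT')

# Steps 6-7: on completion, release the TEE back to the resource pool.
handle.submit("local", "RESMGR", "RELEASE MACHINE %s" % tee_machine)
handle.unregister()
```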

  27. Test Management System’s Services
• TestLink Basic Definitions:
  • Test Project holds together the multiple test artifacts (test cases, test requirements, test plans, etc.).
  • Test Case describes a testing task and expected results. Test Cases are the fundamental piece of TestLink.
  • Test Suite is a collection of Test Cases. It organizes Test Cases into units.
  • Test Build is the software to be tested. It is part of the Test Plan.
  • Test Plan is created when you'd like to execute Test Cases. Test Plans can be made up of the Test Cases from the current Test Project.
• Main Services provided by the TMS:
  • Store all Test Artifacts.
  • Provide ways to add/edit/delete Test Artifacts:
    • Test Cases
    • Test Suites
    • Test Requirements
    • Test Builds
    • Test Plans (Test Cases + Build Tag)
    • Test Results
  • Export/Import Test Plans to/from XML.
  • Trigger test execution (optionally).

  28. Test Master Controller’s Services
• Dispatch jobs to the appropriate BEE & TEE
• Translate Test Parameters between TMS and BEE/TEE
• Provide a TMS-independent test case representation to the TEE
• Provide a mechanism by which the TEE can save test results in the TMS
• Manage Service Providers

  29. Service Providers’ Services
• There will be at least two types of service providers:
  • BEE: Build Execution Engines to build the required software assets.
  • TEE: Test Execution Engines to run the test cases. The TEE may be as simple as a shell script or as complex as VATF (a minimal sketch of a simple TEE is shown below).
• BEE & TEE are the only service providers that the TMC Dispatcher directly requests services from.
• A complex TEE, such as VATF, will typically perform these tasks:
  • Initialize HW assets (both Device Under Test and Test Equipment)
  • Create Test Parameters representation
  • Run Test Execution Logic
  • Load Device Under Test (DUT) w/ appropriate SW assets
  • Boot DUT
  • Connect DUT to Test Equipment
  • Configure DUT and Test Equipment
  • Check conditions or measure metrics
  • Determine Pass/Fail
  • Save Results & Logs
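To illustrate the "as simple as a shell script" end of that spectrum, here is a minimal sketch of a trivial TEE written in Python: it reserves nothing and boots nothing, it just runs one host-side command and records a pass/fail verdict. The command and log path are illustrative placeholders; a real TEE in this architecture would also drive the DUT and test equipment and report results back through the TMC.

```python
# Minimal, hypothetical TEE: run a single host-side command and record pass/fail.
import subprocess
import sys

def run_test(command, log_path="tee_result.log"):
    """Run one test command; return 'PASS' or 'FAIL' and save the output."""
    proc = subprocess.run(command, shell=True, capture_output=True, text=True)
    verdict = "PASS" if proc.returncode == 0 else "FAIL"
    with open(log_path, "w") as log:
        log.write(proc.stdout)
        log.write(proc.stderr)
        log.write("\nverdict: %s\n" % verdict)
    return verdict

if __name__ == "__main__":
    # Illustrative default; a real test plan would supply the command as a parameter.
    cmd = sys.argv[1] if len(sys.argv) > 1 else "ping -c 4 127.0.0.1"
    print(run_test(cmd))
```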
