This guide covers test execution from unit to user acceptance testing, defect tracking, automated tools, test program tracking, and metrics analysis. Learn the phases of testing, white-box and black-box testing, and the importance of test results analysis.
Automated Software Testing: Test Execution and Review
Amritha Muralidharan (axm16u)
Contents
• What is test execution?
• Phases of Testing
  • Unit Testing
  • Integration Testing
  • System Testing
• Test Results Analysis of Regression Tests
• User Acceptance Testing
• Defect Tracking and Automated Tracking Tool
• Test Program Status Tracking and EVMS
• Test Metrics Collection and Analysis
  • White Box
  • Black Box
• Test Execution Review
What Is Test Execution?
• Stage 5 of the ATLM
• Creation of the testing environment
• Tests are carried out on the application
  • Input = test procedures
  • Output = satisfied/altered acceptance criteria
• Software Problem Reporting
• E.g. CourseMarker
Phases of Testing
• Unit, Integration, System and User Acceptance testing
• Tests executed according to the Test Plan for each phase
Phases of Testing: Unit Testing
• Consistent with Development Schedule
• Automated Results Checkout Process
• White Box Level
  • Involves a detailed look at the code
• Best to execute unit tests as soon as possible after the code has been created (a minimal sketch follows this list)
• Static Analysis
• Code Profiling
• Development Test Tools
• Test Documentation
• Metric Collection
• Process Improvement
• Software development folders (SDFs) and Unit Test Evaluation Criteria
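To make the "execute unit tests as soon as the code exists" point concrete, here is a minimal white-box sketch using Python's standard unittest module. The apply_discount function, its valid-range rule, and the chosen boundary values are hypothetical stand-ins, not part of the original material.

```python
import unittest


def apply_discount(price: float, rate: float) -> float:
    """Hypothetical unit under test: apply a fractional discount to a price."""
    if not 0 <= rate <= 1:
        raise ValueError("rate must be between 0 and 1")
    return round(price * (1 - rate), 2)


class ApplyDiscountTest(unittest.TestCase):
    """White-box tests: cases chosen by inspecting the code's branches."""

    def test_typical_discount(self):
        self.assertEqual(apply_discount(100.0, 0.25), 75.0)

    def test_boundary_rates(self):
        # Both edges of the range check in the code above.
        self.assertEqual(apply_discount(80.0, 0.0), 80.0)
        self.assertEqual(apply_discount(80.0, 1.0), 0.0)

    def test_invalid_rate_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(80.0, 1.5)


if __name__ == "__main__":
    unittest.main()
```

Running the file executes all three tests and reports the results, which is the kind of automated results checkout the slide refers to.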
Phases of Testing: Integration Testing
• “Units that are incrementally integrated are tested together based on control flow” (sketched after this list)
• Some integration testing takes place during unit testing
• Test procedures based on the integration test design addressed earlier in the ATLM
• Results are analyzed and documented
• Refining Test Procedures
  • Some test scripts can be automated for re-use in system testing
• End-user approval = end of unit and integration testing
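The quoted definition is easier to see in code: two hypothetical units, Inventory and OrderService, each assumed to have passed unit testing in isolation, are exercised together along the control-flow path that joins them. All names here are illustrative.

```python
import unittest


class Inventory:
    """First unit: tracks stock levels."""

    def __init__(self, stock):
        self._stock = dict(stock)

    def reserve(self, sku, qty):
        if self._stock.get(sku, 0) < qty:
            raise LookupError(f"insufficient stock for {sku}")
        self._stock[sku] -= qty


class OrderService:
    """Second unit: its control flow passes through Inventory.reserve()."""

    def __init__(self, inventory):
        self._inventory = inventory

    def place_order(self, sku, qty):
        self._inventory.reserve(sku, qty)  # the integration point under test
        return {"sku": sku, "qty": qty, "status": "confirmed"}


class OrderInventoryIntegrationTest(unittest.TestCase):
    def test_successful_order_reserves_stock(self):
        service = OrderService(Inventory({"A1": 5}))
        self.assertEqual(service.place_order("A1", 2)["status"], "confirmed")

    def test_order_rejected_when_stock_exhausted(self):
        service = OrderService(Inventory({"A1": 1}))
        with self.assertRaises(LookupError):
            service.place_order("A1", 2)


if __name__ == "__main__":
    unittest.main()
```

Scripts like these often survive into system testing with only the setup swapped out, which is the re-use the slide mentions.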
Phases of Testing: System Testing
• Higher level
• Performed by a separate test team
• May require many individual test procedures
• Test Execution (see the sketch after this list)
  • False Negatives: a real defect slips through because a test wrongly passes
  • False Positives: a test wrongly flags a defect, often due to an environment or test-procedure error
• Software Problem Report
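To make the false negative/positive distinction concrete, here is a hedged system-level sketch that exercises the deployed application from the outside rather than its units. The localhost address and /health endpoint are assumptions for illustration; only the Python standard library is used.

```python
import unittest
import urllib.error
import urllib.request

BASE_URL = "http://localhost:8080"  # hypothetical deployment of the system under test


class SystemSmokeTest(unittest.TestCase):
    def test_service_reports_healthy(self):
        # A too-loose assertion here (e.g. accepting any response at all)
        # risks a false negative: the test passes while a defect slips through.
        try:
            with urllib.request.urlopen(f"{BASE_URL}/health", timeout=5) as resp:
                self.assertEqual(resp.status, 200)
        except urllib.error.URLError as exc:
            # A down test environment lands here: without triage this becomes
            # a false positive, a reported defect that is not in the software.
            self.fail(f"environment problem or genuine defect, needs triage: {exc}")


if __name__ == "__main__":
    unittest.main()
```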
Test Results Analysis of Regression Tests
• New application baseline
• Build release notes should accompany the new build
• Smoke Tests
• Regression Testing, run after defects have been fixed or code has been modified (see the marker-based sketch after this list)
• Investigate high-risk areas of code
  • An indicator of developer behaviour
  • Test team investigates areas with a larger number of problem reports
• Analysis is a clear indicator of whether test execution procedures are useful in locating errors
  • And of where more test effort is required
• System testing is complete when the criteria have been met
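One common way to keep smoke and regression subsets selectable from a single automated suite is test markers. The sketch below assumes pytest; the add function and the SPR number are hypothetical.

```python
# Register the marker names once in pytest.ini so pytest does not warn:
#
#   [pytest]
#   markers =
#       smoke: quick checks run against every new application baseline
#       regression: re-run after fixes, weighted toward high-risk code
import pytest


def add(a, b):
    """Stand-in for the real unit a suite would import."""
    return a + b


@pytest.mark.smoke
def test_basic_addition():
    assert add(2, 3) == 5


@pytest.mark.regression
def test_spr_0042_negative_operands_stay_fixed():
    # Re-executes the scenario from a hypothetical closed problem report.
    assert add(-2, -3) == -5
```

A smoke pass on a new build is then `pytest -m smoke`, and the post-fix regression pass is `pytest -m regression`.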
User Acceptance Testing
• Involves end-user participation
• Consists of a subset of the suite of tests performed at the system level
• Defects found at the UAT level are documented in SPRs
• Satisfactory resolution of level 1 and 2 SPRs = conclusion of testing
Defect Tracking and New Build Process
• Test team performs problem reporting operations in compliance with a defined process:
  • SPR creation
  • E-mail notification to configuration management (CM) and developers
  • New software is checked in via CM tools
  • New software build issuance
• Defect Priority: fatal, high, medium, low
• Engineering Review Board (ERB)
• Automated Defect Tracking Tool (a minimal record sketch follows this list)
  • Ensures reported defects receive proper attention
  • Provides a central defect tracking repository
  • Has several basic characteristics
• Automated Test Tools
  • Maintain test results / permit automatic generation of test results
• Recommended that SPRs are generated per life-cycle stage
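To make the central repository idea concrete, here is a minimal sketch of an SPR record carrying the four priority levels above. The field names and the ERB triage rule are assumptions for illustration, not a schema from the original material.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class Priority(Enum):
    FATAL = 1
    HIGH = 2
    MEDIUM = 3
    LOW = 4


@dataclass
class SoftwareProblemReport:
    """Minimal SPR record mirroring the process fields named above."""
    identifier: str
    summary: str
    lifecycle_stage: str  # e.g. "unit", "integration", "system", "UAT"
    priority: Priority
    opened: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    status: str = "open"  # moves to "fixed" once CM issues a new build


def needs_erb_review(spr: SoftwareProblemReport) -> bool:
    # Assumption for this sketch: the ERB triages fatal- and high-priority
    # reports before a fix is scheduled.
    return spr.priority in (Priority.FATAL, Priority.HIGH)


spr = SoftwareProblemReport("SPR-0103", "Login rejects valid passwords",
                            "system", Priority.HIGH)
print(needs_erb_review(spr))  # True
```

A real tracking tool layers the e-mail notification and CM hooks from the process above on top of a record like this.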
Test Program Status Tracking and EVMS
• Helps to efficiently handle problems that arise during the test effort
• The test engineer must produce meaningful reports based on test plan measures and metrics:
  • Test log
  • Test coverage reports
Test Program Status Tracking and EVMS (cont.)
• To effectively monitor testing progress and report to senior management, an earned value approach is used
• EVMS = one of the best ways to track progress
  • Tracks the value of completed work and compares it to planned and actual costs
  • Provides a true measure of schedule and cost status
  • Enables creation of effective corrective actions
• 4 steps (applied in the sketch below):
  • Identify short tasks
  • Schedule each task
  • Assign a budget to each task
  • Measure the progress of each task
• Schedule Variance = earned value for work completed − planned budget
• Cost Variance = earned value for work completed − actual cost
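Applying the two formulas is straightforward once each short task carries a planned budget, an earned value, and an actual cost; the task names and figures below are invented for illustration.

```python
from dataclasses import dataclass


@dataclass
class Task:
    name: str
    planned_budget: float  # budgeted cost of the work scheduled to date
    earned_value: float    # budgeted cost of the work actually completed
    actual_cost: float     # what the completed work actually cost


tasks = [
    Task("Write unit test procedures", planned_budget=40.0,
         earned_value=40.0, actual_cost=44.0),
    Task("Execute integration tests", planned_budget=60.0,
         earned_value=45.0, actual_cost=50.0),
]

earned = sum(t.earned_value for t in tasks)
planned = sum(t.planned_budget for t in tasks)
actual = sum(t.actual_cost for t in tasks)

schedule_variance = earned - planned  # negative => behind schedule
cost_variance = earned - actual       # negative => over budget

print(f"SV = {schedule_variance:+.1f}, CV = {cost_variance:+.1f}")
# SV = -15.0 (behind schedule), CV = -9.0 (over budget)
```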
Test Metrics Collection and Analysis
• Metrics = key indicators of test coverage, progress and quality of the test effort
• Separated into white box and black box testing efforts
• HOWEVER, gathering too many metrics can be time-consuming
Test Metrics Collection and Analysis: White Box Testing
• Targets the application's internal workings
• Measures the DEPTH of testing
• Coverage Analysis
• Code Analysis and Code Profiling, e.g. Rational Quantify
  • Objective is to locate complex areas of source code
• Fault Density: Fd = Nd/KSLOC, i.e. defects found (Nd) per thousand source lines of code (computed after this list)
  • Predicts the remaining number of faults
• Design Complexity Metrics measure the number of ways a module can call other modules
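Fault density reduces to a one-line calculation; the defect count and module size below are invented for illustration.

```python
def fault_density(defects_found: int, source_lines: int) -> float:
    """Fd = Nd / KSLOC: defects found per thousand source lines of code."""
    return defects_found / (source_lines / 1000)


# Example: 18 defects found in a 12,000-line module.
print(fault_density(18, 12_000))  # 1.5 faults per KSLOC
```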
Test Metrics Collection and Analysis: Black Box Testing
• Focuses on the BREADTH of testing
• Based on the application's external considerations
• Each metric is assigned to one of three categories:
  • Coverage
  • Progress
  • Quality
Test Metrics Collection and Analysis: Black Box Testing (cont.)
• Coverage
  • Test Coverage (see the example calculations after this list)
  • System Coverage
  • Functional Coverage
• Progress
  • Test Procedure Execution Status
  • Error Discovery Rate
  • Defect Aging
  • Defect Fix Rate
  • Defect Trend Analysis
• Quality
  • Test Success Index
  • Defect Density
  • Test Effectiveness
  • Problem Report Acceptance Criteria Metric
  • Test Automation Metric
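Several of these metrics reduce to simple ratios over the test log. The counts below are invented, and the exact formulas (executed over planned for Test Coverage, passed over executed for Test Success Index) follow common usage and may differ in detail from the source's definitions.

```python
total_planned = 240  # test procedures identified in the test plan
executed = 180       # procedures run so far (Test Procedure Execution Status)
passed = 162         # executed procedures that raised no SPR

test_coverage = executed / total_planned  # coverage category
test_success_index = passed / executed    # quality category

print(f"Test Coverage: {test_coverage:.0%} "
      f"({executed}/{total_planned} procedures executed)")
print(f"Test Success Index: {test_success_index:.0%}")
# Test Coverage: 75% (180/240 procedures executed)
# Test Success Index: 90%
```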
Test Execution Review
• Stage 6 of the ATLM
• Review test performance, e.g. via the metrics above
• Helps determine enhancements to be made
• Document well-performed activities
• Lessons learned should be acknowledged/documented
  • Aim to ensure best practices are used
  • Integrate a continual, iterative learning process
  • Encouraged as improvement opportunities
  • “Check pulse”
• Documentation should be easily accessible to all team members
• Corrective actions suggested must be evaluated
• Represents the need for process improvement