Software Testing - PowerPoint PPT Presentation

Presentation Transcript

  1. Test Lifecycle and Test Process Software Testing

  2. Test Level Process
  • Test Planning and Control: define test scope, objectives and goals; set the test strategy; analyze risks; define exit criteria; estimate, organize and schedule; manage and control the tests; monitor and report status
  • Test Analysis and Design: review the test basis; identify test conditions; decide test design techniques; evaluate testability; set up the environment
  • Test Implementation and Execution: specify test cases, priorities, data and procedures; pre-test; execute (and re-execute) tests
  • Evaluating Exit Criteria and Reporting: check the exit criteria; write the summary report
  • Test Closure Activities: testware configuration; evaluate the test process

  3. Test Planning and Control Activities
  • Determine scope / objectives / levels
  • Risk analysis
  • Test strategy
  • Managing the test process (control)
  • Test estimation
  • Test planning
  • Relationship with stakeholders
  • Reporting test activities

  4. Level’s Test Plan
  • Describes specific activities for each test level
  • Used when a test level needs to be extended beyond the Master Test Plan
  • Provides detailed tasks, work schedules and milestones for each level that are not dealt with in the Master Test Plan
  • Describes standards and templates for the test specifications of each test level
  • Planning is influenced by the test policy and strategy of the organization and aligned with the Master Test Plan
  • Maintenance test management documentation can replace a level test plan

  5. Master Test Plan
  • A master test plan is necessary when multiple test levels are used; it directs and controls each test level
  • Hierarchy: Master Test Plan → Acceptance Test Plan / System Test Plan / Integration Test Plan / Component Test Plan

  6. Test Plan Doc. and Template
  • Test plan documentation templates have been developed and applied for commonality and readability in many organizations
  • IEEE 829, “Standard for Software Test Documentation”, provides testing documentation templates, including the test plan and planning activities

  7. Test Planning Activities
  • Determining the scope and risks, and identifying the objectives of testing
  • Defining the overall approach of testing (test techniques, test items, coverage, interfaces between stakeholders, testware)
  • Assigning resources for the different test activities (e.g. HR, test environment, PCs)
  • Implementing the test strategy and aligning it with the test policy
  • Scheduling test analysis and design activities
  • Scheduling test implementation, execution and evaluation
  • Defining the exit criteria

  8. Test Policy
  • Describes the testing philosophy of the organization
  • Applies to all testing performed on all projects within the organization (generally short and simple; an executive-level document)
  • Definition of testing
  • Mission and targets of testing (quality level, main quality characteristics)
  • Strategic, high-level view on testing and testing duties
  • Core roles of the testing organization
  • Defines the test process (including the level of independence)
  • Testing approach for customer satisfaction
  • Test process improvement (goals, KPIs, model)
  • Aligned with the quality policy of the organization

  9. Exit Criteria
  • Limited time and budget
  • Number of defects not yet fixed (by severity)
  • Number of retests
  • Number of defects found per hour approaches 0
  • All test cases executed at least once (when they are well designed and consider the related risks & coverage) + no major defects
  • Prevented damage < cost of testing
  • A proper combination of the above

  10. Test Control Activities
  • Measure and analyze test results
  • Monitor and report the status of test progress, test coverage and exit criteria
  • Take corrective action against the test plan
  • Make decisions about changing or halting testing

  11. Traditional Approaches for Test Strategy
  • Analytical approaches: risk-based testing
  • Model-based approaches: stochastic testing (reliability growth models, operational profiles)
  • Methodical approaches: error guessing, fault attacks, checklist-based, quality-characteristics-based
  • Process- or standard-compliant approaches: IEEE 829
  • Dynamic and heuristic approaches: exploratory testing, bug-based attacks

  12. Test Strategy – 1
  • Describes approaches, including product and project risks
  • High-level documentation of testing (aligned with the test policy)
  • Explains the test levels to be executed
  • Entry and exit criteria for each test level, and overall guidance on the relationship between levels
  • Describes project and product risks and plans risk management
  • Explains the relationship between risk and testing clearly
  • A test strategy can be established for an organization or for a project
  • A project test strategy should be aligned with the organizational test strategy

  13. Test Strategy – 2
  • The test strategy includes:
  • Integration procedure
  • Test specification techniques
  • Level of test independence
  • Mandatory/optional standards
  • Test achievement
  • Test automation
  • Testware reusability
  • Retesting and regression testing
  • Test control and reporting
  • Measures and metrics for testing
  • Incident (defect) management
  • Testware configuration

  14. Risk at Testing
  • What is risk? Consider limited time and budget
  • Defect -> failure -> risk
  • Defect: a specific cause of a failure (related to the product)
  • Failure: an actual deviation of the component or system from its expected function (related to events)
  • Risk: the cost caused by failure (a factor that could result in future negative consequences)
  • RISK = likelihood of failure x damage
  • Likelihood of failure = frequency of use x chance of a fault

  15. Risk-based strategy – Procedure
  • Risk management cycle: Identify Risk → Analyze Risk → Plan Risk → Track Risk
  • Identify items where risks are possible
  • Analyze whether items are important, complex and likely to contain defects (determine the priority): RISK = LIKELIHOOD x IMPACT
  • Plan to mitigate risks based on the risk analysis (a test strategy to reduce or mitigate risk)
  • Monitor the risks and the risk mitigation actions

  16. Risk-based strategy – Identify Risk
  • Classify functional / technical items
  • High-level test items based on the requirements
  • Low-level test items based on the architecture
  • A brainstorming session can be helpful
  • Fewer than 35 risk items are recommended

  17. Risk-based strategy – Analyze Risk
  • Risk = likelihood x impact
  • Determine risk factors based on the defect patterns of previous projects
  • Classify risks into technical risk (likelihood; addressed by development testing at unit/integration level) and business risk (impact; addressed by acceptance testing)

  18. Risk-based strategy – Analyze Risk
  • Risk factors (experience-based; defect patterns / history)
  • Factors for likelihood: complexity; new development (level of re-use); interrelationships (number of interfaces); size (lines of code); new technology (difficulty); inexperience of the development team
  • Factors for business impact: user importance (selling item); financial damage (e.g. safety); usage intensity; external visibility; customization needed
  • Weighting can be applied
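The weighted scoring described above can be sketched in code. This is an illustrative sketch only: the factor names, weights and the 0/3/5/9 ratings below are assumptions, not taken from the slides.

```python
# Illustrative weights per risk factor (assumed, not from the slides).
LIKELIHOOD_WEIGHTS = {"complexity": 3, "new_technology": 2, "interfaces": 1}
IMPACT_WEIGHTS = {"user_importance": 3, "financial_damage": 2, "usage_intensity": 1}

def weighted_score(scores, weights):
    """Weighted sum of per-factor scores (each factor rated e.g. 0, 3, 5 or 9)."""
    return sum(weights[factor] * score for factor, score in scores.items())

def risk(likelihood_scores, impact_scores):
    """RISK = LIKELIHOOD x IMPACT, each side a weighted factor sum."""
    return (weighted_score(likelihood_scores, LIKELIHOOD_WEIGHTS)
            * weighted_score(impact_scores, IMPACT_WEIGHTS))

# One hypothetical risk item, rated by stakeholders:
item_risk = risk({"complexity": 9, "new_technology": 5, "interfaces": 3},
                 {"user_importance": 9, "financial_damage": 3, "usage_intensity": 5})
```

Items are then ranked by `item_risk` and placed into the risk matrix discussed on the following slides.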

  19. Risk-based strategy – Analyze Risk
  • All relevant stakeholders participate
  • Identify stakeholders (internal and external to the project)
  • Stakeholders tend to think that all items are risky
  • Risk level: 9 = critical, 5 = high, 3 = normal, 0 = none
  • Try to reach a consensus on the risk levels in the meeting

  20. Risk-based strategy – Analyze Risk
  • Risk matrix: plot each item by likelihood and impact into four quadrants
  • STA – Severe Test Area
  • ITA – Intensive Test Area
  • STTA – Strong Test Area
  • FTA – Fundamental Test Area
  • Discussion needed for items on the quadrant boundaries

  21. Risk-Based Testing – Plan Risk
  • Example test design techniques per risk quadrant (low-level testing); the matrix assigns, across the ITA / STA / STTA / FTA quadrants: decision coverage 70%; peer review; formal test; boundary value analysis; code inspection; indirect testing (test more if possible)
  • Discussion needed for items on the quadrant boundaries

  22. Risk-Based Testing – Plan Risk
  • Example test design techniques per risk quadrant (high-level testing)
  • ITA: use case testing (basic flow only); equivalence partitioning
  • STA: use case testing (all flows included); decision table testing; boundary value analysis; pairwise testing
  • FTA: use case testing (basic flow only)
  • STTA: use case testing (all flows included); equivalence partitioning
  • Discussion needed for items on the quadrant boundaries

  23. Risk-Based Testing – Plan Risk (figure: the risk management cycle – Identify Risk → Analyze Risk → Plan Risk → Track Risk)

  24. Risk-Based Testing – Plan Risk
  • A risk-based testing strategy covers:
  • Test specification techniques
  • Exit criteria
  • Test basis / test design reviews
  • Positioning of test members
  • Re-testing
  • Regression testing
  • Test control and reporting
  • Level of test independence
  • Determining priority
  • Integration procedure
  • Mandatory/optional standards
  • Test environment
  • Test automation
  • Testware reusability
  • Measures & metrics for testing
  • Incident management
  • Testware configuration

  25. Risk-Based Testing – Plan Risk • Risk matrix • Analyze requirements coverage by test case

  26. Risk-Based Test Strategy – Macroscopic • The risk-based test strategy in the test plan • Each test level has its own risk-based strategy

  27. Test Estimation
  • Estimation methods
  • Estimate test effort based on metrics from similar or previous projects
  • Estimation by the owners of the tasks and by experts
  • Based on the test assignment, risk analysis and test strategy
  • Collect experience-based estimates from task owners and experts for the detailed tasks
  • Reach consensus on the estimates with task owners and experts collectively
  • Other estimation methods
  • TPA (Test Point Analysis)
  • Estimation tools

  28. Test Environment
  • Define the physical and hardware test environment
  • CPU, RAM, VGA card, NIC, etc.
  • Other products or equipment that the system interacts with
  • Test equipment and infrastructure (servers, chambers, network traffic generators, telecommunication network facilities, broadcasting networks, etc.)
  • Test automation tools (optional)
  • Define the software test environment
  • OS
  • Browsers
  • DirectX
  • Application software

  29. Test Level Process (overview repeated)
  • Test Planning and Control: define test scope, objectives and goals; set the test strategy; analyze risks; define exit criteria; estimate, organize and schedule; manage and control the tests; monitor and report status
  • Test Analysis and Design: review the test basis; identify test conditions; decide test design techniques; evaluate testability; set up the environment
  • Test Implementation and Execution: specify test cases, priorities, data and procedures; pre-test; execute (and re-execute) tests
  • Evaluating Exit Criteria and Reporting: check the exit criteria; write the summary report
  • Test Closure Activities: testware configuration; evaluate the test process

  30. Test Analysis & Design Activities • Review the test basis • Identify test requirements, test conditions and test data based on analysis of items, the specification, behavior and structure • Designing and prioritizing test cases • Evaluating testability of the requirement and test objects • Designing the test environment set-up and identifying any required infrastructure and tools

  31. Test Designing Procedure
  • Design tests against test conditions
  • Test condition: an item or event of a component or system that could be verified by one or more test cases, e.g. a function, transaction, feature, quality attribute, or structural element
  • Specify test cases
  • Specify the test procedure (or test script)
  • Test requirement documentation (the test basis): requirements; architecture; design; interfaces; programming specifications; program code; user manual

  32. Test Cases
  • Definition
  • A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition
  • Describes the test condition to be tested, plus input values and the expected result
  • A set of information, an entity for test execution
  • Objectives of test cases
  • To detect as many defects as possible
  • To guarantee test coverage
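The definition above maps naturally onto a small data structure. A minimal sketch; the field names and the sample values are illustrative assumptions, not from the slides.

```python
from dataclasses import dataclass, field

@dataclass
class TestCase:
    """A test case per the definition above: inputs, preconditions,
    expected result and postconditions, tied to one test condition."""
    identifier: str
    test_condition: str
    preconditions: list
    inputs: dict
    expected_result: str
    postconditions: list = field(default_factory=list)

# Hypothetical instance, based on the tax specification used later in the deck.
tc = TestCase(
    identifier="TC-001",
    test_condition="20% tax rate for income below RM20,000",
    preconditions=["person is aged between 16 and 65"],
    inputs={"income": 15000, "has_child": False},
    expected_result="tax due at 20%",
)
```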

  33. Test case - example

  34. Test case and execution scope
  • The requirement (the test oracle) and the implemented system overlap only partially:
  • Implemented without a requirement
  • Implemented per the requirement, but malfunctioning
  • Requirement exists, but not implemented
  • Implemented without a requirement, and malfunctioning

  35. Categories of Test Design Techniques
  • Specification-based techniques
  • Models, either formal or informal, are used for the specification; from these models, test cases can be derived systematically
  • Structure-based techniques
  • Information about how the software is constructed (for example, code and design) is used to derive the test cases
  • The coverage achieved by existing test cases can be measured, and further test cases can be derived systematically to increase coverage
  • Experience-based techniques
  • Knowledge of testers, developers, users and other stakeholders about the software, its usage and its environment
  • Knowledge about likely defects and their distribution
  • Documentation needed

  36. Specification-based techniques • Equivalence partitioning • Boundary value analysis • Pairwise testing • Decision table testing • State transition testing • Use case testing

  37. Write Test Cases
  • [Specification]
  • “People aged between 16 and 65 have to pay tax. If income is less than RM20,000, tax is due at 20%; otherwise it is due at 50%. If they have a child, a 10% tax reduction is applied.”

  38. Equivalence Partitioning (EP)
  • Input / output spaces are divided into groups that are expected to exhibit similar behavior, so they are likely to be processed in the same way
  • Tests can be designed to cover the partitions
  • EP can be applied to both valid and invalid data
  • Application procedure
  • Divide the input into equivalence classes with similar characteristics, using the given information about the system (e.g. the program specification)
  • Select one value from each partition
  • Make test cases to cover all valid equivalence classes
  • Make test cases to cover all invalid equivalence classes

  39. EP – Example
  • Requirement: behavior depends on the voltage V, partitioned into six classes: V < 2.50; 2.50 <= V < 2.80; 2.80 <= V < 3.30; 3.30 <= V < 3.80; 3.80 <= V < 4.30; 4.30 <= V
  • Test cases: one value from each partition (partition boundaries at 2.50V, 2.80V, 3.30V, 3.80V and 4.30V)
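The partitioning above can be sketched as a classifier, with one representative test value per class (the representative values below are illustrative, any value inside a partition would do):

```python
def classify_voltage(v):
    """Map a voltage to its equivalence partition, per the ranges above."""
    if v < 2.50:
        return "V < 2.50"
    if v < 2.80:
        return "2.50 <= V < 2.80"
    if v < 3.30:
        return "2.80 <= V < 3.30"
    if v < 3.80:
        return "3.30 <= V < 3.80"
    if v < 4.30:
        return "3.80 <= V < 4.30"
    return "4.30 <= V"

# One representative value per partition covers all six classes.
representatives = [2.0, 2.6, 3.0, 3.5, 4.0, 4.5]
partitions = {classify_voltage(v) for v in representatives}
```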

  40. EP – Practice • Use EP to design test cases for the specification given previously • Equivalence classes • Test cases

  41. Boundary Value Analysis
  • Test the maximum and minimum values of each partition (the boundaries)
  • For the same voltage partitions (V < 2.50; 2.50 <= V < 2.80; 2.80 <= V < 3.30; 3.30 <= V < 3.80; 3.80 <= V < 4.30; 4.30 <= V), the boundaries are 2.50V, 2.80V, 3.30V, 3.80V and 4.30V
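The boundary values can be enumerated mechanically. The sketch below assumes a measurement resolution of 0.01 V (an assumption; the slides do not state one) and, for each boundary, tests the boundary itself plus the value just below it:

```python
EPSILON = 0.01  # assumed measurement resolution; not stated in the slides

boundaries = [2.50, 2.80, 3.30, 3.80, 4.30]

# For each boundary: the value just below it (top of the lower partition)
# and the boundary itself (bottom of the upper partition).
bva_values = []
for b in boundaries:
    bva_values += [round(b - EPSILON, 2), b]
```

These ten values complement the one-per-partition EP values: they exercise each `<` / `<=` comparison on both sides.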

  42. Pairwise Testing
  • According to observation, interactions of two factors cause most defects -> cover all combinations of every pair of factors
  • Each value of a parameter has to be paired with each value of every other parameter at least once
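The "every pair at least once" rule can be checked mechanically. A minimal sketch (the parameter names and values are illustrative): for three two-valued parameters, four tests cover every pair of values, versus eight for the exhaustive combination.

```python
from itertools import combinations, product

def uncovered_pairs(parameters, suite):
    """Return the parameter-value pairs not covered by any test in the suite."""
    names = list(parameters)
    missing = set()
    for a, b in combinations(names, 2):
        for va, vb in product(parameters[a], parameters[b]):
            if not any(t[a] == va and t[b] == vb for t in suite):
                missing.add(((a, va), (b, vb)))
    return missing

# Hypothetical parameters and a 4-test pairwise suite (exhaustive would be 8).
params = {"os": ["win", "mac"], "browser": ["chrome", "firefox"], "lang": ["en", "kr"]}
suite = [
    {"os": "win", "browser": "chrome",  "lang": "en"},
    {"os": "win", "browser": "firefox", "lang": "kr"},
    {"os": "mac", "browser": "chrome",  "lang": "kr"},
    {"os": "mac", "browser": "firefox", "lang": "en"},
]
```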

  43. Pairwise Testing

  44. Decision Table Testing
  • If the input conditions (or actions) can either be true or false, the decision table contains the triggering conditions, often combinations of true and false for all input conditions, and the resulting actions for each combination of conditions
  • Advantages
  • It detects problems in the test basis, such as the requirements, while the test cases are being designed (enables meaningful participation in review meetings)
  • It detects incompleteness or ambiguity in the test basis
  • Disadvantages
  • It may cost considerable effort and time to build
  • It is hard to design when the system is complicated, and mistakes in the test cases are easy to make
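Applied to the tax specification from slide 37, the decision table looks like the sketch below. The rule that people outside the 16-65 age band pay no tax is an assumption; the specification leaves that case implicit.

```python
# Conditions, in order: (aged 16-65, income < 20000, has child).
# Each rule maps a full combination of conditions to an action (the tax rate).
DECISION_TABLE = {
    (True,  True,  True):  0.20 * 0.9,  # 20% rate with 10% reduction
    (True,  True,  False): 0.20,
    (True,  False, True):  0.50 * 0.9,  # 50% rate with 10% reduction
    (True,  False, False): 0.50,
    (False, True,  True):  0.0,         # assumed: outside taxable age, no tax
    (False, True,  False): 0.0,
    (False, False, True):  0.0,
    (False, False, False): 0.0,
}

def tax_rate(age, income, has_child):
    """Look up the applicable tax rate from the decision table."""
    return DECISION_TABLE[(16 <= age <= 65, income < 20000, has_child)]
```

Writing the table forces every combination to be considered, which is exactly how it exposes gaps in the test basis, e.g. the unspecified over-65 case above.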

  45. DT : example

  46. DT – Example 2 – Logical TC

  47. Example 2 - Physical TC

  48. State Transition Testing
  • A test technique to verify the relationships between transitions, events, actions, states and conditions (to cover all conditions between states and events, both valid and invalid)
  • Verifies whether the system or software conforms to its state transition model
  • The defects found by state transition testing are classified into state, transition, guard and event defects
  • Defects can be found both in the system (or software) and in its specification

  49. STT – classification of defects
  • Faults in the model
  • Missing initial state
  • A guard placed in a state instead of in a transition
  • Overlapping guards
  • Can be found by inspection or static analysis
  • Faults in the implementation
  • Extra / missing / corrupt state
  • Missing / wrong action
  • Sneak paths, trap doors, back doors
  • Can be found by dynamic testing

  50. STT – example: Vending Machine (“Service The Customer”)
  • States: Standby, Accepting Coins, Accepting Soda Selection (initial transition: evPowerOn → Standby)
  • Standby: evCoinDrop [deposit < price] → Accepting Coins; evCoinDrop [deposit >= price] → Accepting Soda Selection
  • Accepting Coins: evCoinDrop [deposit < price] → Accepting Coins; evCoinDrop [deposit >= price] → Accepting Soda Selection; evCancel / ReturnDeposit() → Standby
  • Accepting Soda Selection: evSodaSelectButton / ReleaseCan(); ReturnChange(); UpdateStock() → Standby; evCancel / ReturnDeposit(); ResetLights() → Standby
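A minimal executable sketch of the vending-machine model above, usable to drive state transition tests. The price and coin values are illustrative; stock handling and lights are omitted.

```python
PRICE = 100  # illustrative price; the slide does not give one

class VendingMachine:
    def __init__(self):
        self.state, self.deposit = "Standby", 0  # evPowerOn -> Standby

    def event(self, name, coin=0):
        """Apply one event; return change on a successful sale, else None."""
        if name == "evCoinDrop":
            self.deposit += coin
            # Guard: move on to selection once the deposit covers the price.
            self.state = ("AcceptingSodaSelection" if self.deposit >= PRICE
                          else "AcceptingCoins")
        elif name == "evCancel":
            self.deposit, self.state = 0, "Standby"   # ReturnDeposit()
        elif name == "evSodaSelectButton" and self.state == "AcceptingSodaSelection":
            change = self.deposit - PRICE             # ReleaseCan(); ReturnChange()
            self.deposit, self.state = 0, "Standby"
            return change
        # Any other event/state pair is ignored; a sneak-path test would
        # assert here that the state did not change.
        return None

vm = VendingMachine()
vm.event("evCoinDrop", coin=50)          # Standby -> Accepting Coins
vm.event("evCoinDrop", coin=60)          # deposit 110 >= price -> Accepting Soda Selection
change = vm.event("evSodaSelectButton")  # releases the can, returns the change
```

Valid-transition tests walk paths like the one above; invalid-transition tests fire each event in each state where the model defines no arrow and assert nothing happens.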