
Chap 6 - Test Design




  1. Test Design

  2. Test Design
  ● Before starting the Test Design phase, the Test Team should gather all the required understanding of the Application's functionality
  ● If the Application is newly developed: use the SRS, use cases, wireframes, and the prototype
  ● If the Application is being enhanced or is already in use, use:
    - A demo of the existing application
    - Previous testing artifacts such as test scenarios, test cases, defects, and the Test Plan
    - Change requests

  3. Test Design Continued
  ● The Tester should gain answers to the following questions:
    - What are the various features/functionalities of the Application?
    - What are the various Roles for accessing the Application?
    - Which features are available to each Role?
    - What data is captured, and how does this data flow within the Application?
    - What are the validation rules for input data?
    - Which functionalities depend on each other, and how?
    - What are the business rules, and how are they implemented?
    - What is the category of the Application? (e.g. a web site used globally; a real-time application such as a share-market display that must show changing information quickly)
    - Does the Application interface with another Application? Does it take inputs from another Application?

  4. Test Scenario
  ● Broader-level functionalities of the Application can be defined as Test Scenarios
  ● Test Scenarios are designed based on:
    - The SRS (high-level features)
    - Use Cases: an Actor interacting with the Application through processes, where each process is a Test Scenario
    - The state transition diagram of an Entity, e.g. a leave request gets Submitted -> Approved -> Sanctioned, or Submitted -> Rejected -> Canceled

  5. Use Case

  6. Use Case Continued
  ● A use case describes interactions between Actors (users) and the Application
  ● Each use case has a number of processes, which are features/functionalities of the Application
  ● Each process typically forms a test scenario

  7. Test Scenario Continued
  ● Test Scenarios help:
    - In developing test cases (one scenario can lead to multiple test cases)
    - In developing end-to-end scenarios
  ● They can be used in Integration testing, System testing, and UAT

  8. Test Scenario Continued
  ● Example: an e-mail Application
  ● The Test Scenarios will be:
    - Check the functionality of composing a new mail
    - Check the functionality of receiving mails
    - Check the functionality of sending a mail
    - Check the functionality of replying to a received mail
    - Check the functionality of forwarding a received mail
    - Check the functionality of printing a received mail
    - Check the functionality of deleting an e-mail
    - Check the functionality of saving an e-mail as a draft

  9. Test Case
  ● Test cases are derived from:
    - Test Scenarios
    - Use Cases, whose interactions are traced for the happy flow and error flows, i.e. specific test conditions
  ● They describe "how" to test a functionality with various input data:
    - The sequence of steps to be performed
    - The test data to be used at each step
  ● Test cases are used for Test Execution

  10. Test Case Continued
  ● A Test Case defines:
    - Objective: which test condition is being checked?
    - Pre-requisite: conditions to be met before starting execution of the test case
    - Steps & Data: the sequence of operations to be performed on the Application and the test data to be used

  11. Test Case Continued
  ● Expected Result: describes how the Application should respond to the steps performed
  ● Actual Result: the actual response given by the Application, observed by the tester and recorded during Test Execution
  ● Status: based on the Actual Result and Expected Result, the test case is marked "Pass" (if they match) or "Fail" (if they don't)
  ● A unique Test Case ID should be given to each Test Case while documenting

  12. Test Case Continued
  Test Case ID: Email_App_Com_001
  Objective: To check the functionality of composing an e-mail with an attachment smaller than 10 MB
  Pre-requisite: One Word document of size less than 10 MB should be available
  Steps & Data:
    1. Click on Compose e-mail
    2. Type a message in the e-mail body and provide a subject line
    3. Attach the Word document
  Expected Result:
    1. The Compose e-mail window should open
    2. The document should get attached to the mail
  Actual Result: (recorded during Test Execution)
  Status: (Pass / Fail)
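As a sketch, the table above can also be captured as a data record, the shape a test management tool might store. The field names mirror slides 10-11, but the dict layout itself is an illustrative choice, not from the slides:

```python
# Hypothetical record form of the compose-e-mail test case.
test_case = {
    "id": "Email_App_Com_001",
    "objective": "Check compose e-mail with an attachment smaller than 10 MB",
    "prerequisite": "One Word document smaller than 10 MB is available",
    "steps": [
        "Click on Compose e-mail",
        "Type a message in the e-mail body and provide a subject line",
        "Attach the Word document",
    ],
    "expected": [
        "Compose e-mail window should open",
        "Document should get attached to the mail",
    ],
    "actual": None,   # observed and filled in during Test Execution
    "status": None,   # "Pass" if actual matches expected, else "Fail"
}
```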

  13. Test Case Continued
  ● Points to remember while developing/writing Test Cases:
    - Not every test case needs a pre-requisite; when one is mentioned, it should be very clear
    - Steps should be in the correct sequence and should mention the exact test data to be used
    - The expected result should be clearly stated so that test case evaluation becomes possible
    - The expected result for each test step should be clearly documented
    - Do not combine multiple test objectives into one in order to reduce the number of Test Cases
    - Write Test Cases that are simple to understand and follow
    - Do not assume any functionality while writing a Test Case
    - Develop test cases only based on Test Scenarios, Use Cases, the SRS, state transition diagrams, and data flow diagrams

  14. Test Cases Review
  ● Test Cases are one of the key deliverables of the Test Design phase
  ● Test Cases are reviewed by the Test Lead
  ● The review of Test Cases checks:
    - Is the test coverage adequate? (the Objective column of the Test Cases is referred to)
    - Are all Test Scenarios covered?
    - Are all possible Test Conditions for each Test Scenario covered? (happy flow, all paths of the happy flow based on input data, error flow)
    - Is the RTM complete in terms of testing each user requirement?

  15. Test Cases Review (Contd.)
  ● Each Test Case can stand in isolation
  ● The expected result is explicitly stated, leaving no room for different interpretations
  ● Test Cases follow the correct documentation standard:
    - The right template is followed as per the company standard
    - A Test Management Tool is used
    - Test Cases are created and maintained under the correct centralized location/project

  16. Test Case Management Tools
  ● Test Case Management Tools are used to create and maintain test cases across the projects in an Organization
  ● If the Organization does not use a Test Management Tool, test cases are maintained at a centralized location on a server to which testers are given access
  ● Commonly used tools are Quality Center and TestLink

  17. Test Case Management Tools
  ● Features found in any Test Management Tool:
    - Allows creation of a Project
    - Provides access only to the required Testing team members
    - Allows creation of Test Suites
    - Mapping of Test Cases to Requirements
    - Import/export of Test Cases to Excel
    - Selecting Test Cases for execution
    - Basic and advanced search facilities to locate test cases
    - Version tracking of Test Cases
    - Execution of Test Cases
    - Reports on various criteria

  18. Test Data
  ● Test data is the data used while executing an Application / executing a Test Case
  ● Test Data can be of any type:
    - Text or a number
    - A value selected from a drop-down list, radio button, or checkboxes
    - A file of a type that the Application loads
  ● How is test data created?
    - Data is created by testers based on Test Conditions
    - Live/production data is used for testing

  19. Test Data Continued
  ● A single Test Case can be executed multiple times using different sets of data that produce the same test result, i.e. the Expected Result is the same
  ● For different sets of data the Application's behaviour can differ, i.e. the Expected Result is different, hence separate Test Cases are designed
  ● The test steps may be the same, but based on the test data used, the Application's expected result/response can be different:
    - Happy flow
    - Alternative flow
    - Error flow
  ● Use of the correct test data is crucial in testing an Application and depends on the requirement
  ● Various test data creation techniques are used to achieve effective testing:
    - Boundary Value Analysis
    - Equivalence Class Partitioning
    - Error Guessing
    - Decision Table
    - State Transition Diagram
    - Negative Testing

  20. Boundary Value Analysis (BVA)
  ● A test data design technique in which tests are designed to include representatives of boundary values; the test cases are developed around the boundary conditions
  ● Applications often change their behaviour at a boundary, hence test data is selected on the boundary and just on either side of it

  21. Boundary Value Analysis (BVA)
  ● E.g. to test that the system accepts numbers in the range 1 to 100
  ● We need to test with the values 0, 1, 2, 99, 100, 101; that gives us the confidence to say that the application behaves correctly at the boundaries
    - Values 1, 2, 99, 100 should be accepted (happy flow): the user can proceed further
    - Values 0 and 101 should not be accepted (error flow): the Application displays an error message and does not allow the user to proceed
  ● BVA cannot be used when the expected user input is of Boolean type (true/false values)
  ● BVA need not always have two boundaries (upper and lower)
    - E.g. leaves above 20 can be carried forward to the next year
    - Here 20 leaves is the boundary condition, so we test with three employees having 19, 20, and 21 leaves respectively
    - Only for the employee having 21 leaves should the leaves get carried forward to the next year
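The 1-100 example above can be sketched in code. `accepts` is a hypothetical validator standing in for the real Application; only the range 1-100 comes from the slide:

```python
def accepts(value, low=1, high=100):
    """Hypothetical system rule: accept numbers in the range [low, high]."""
    return low <= value <= high

def bva_inputs(low, high):
    """BVA test data: each boundary plus the values just inside and outside it."""
    return [low - 1, low, low + 1, high - 1, high, high + 1]

print(bva_inputs(1, 100))  # [0, 1, 2, 99, 100, 101]
# Happy flow: 1, 2, 99, 100 are accepted; error flow: 0 and 101 are rejected
```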

  22. Equivalence Class Partitioning (ECP)
  ● A design technique that reduces the input test data to a practicable size while still achieving the required test coverage
  ● A set of input test data for which the Application behaves in the same manner is considered an equivalence class
  ● Instead of using all input values from a class, a few representative values are used

  23. Equivalence Class Partitioning (ECP) Continued
  ● E.g. the system accepts inputs only between 1 and 1000
  ● Identify the equivalence classes:
    - Valid class: 1-1000. Select any value between 1 and 1000 as test data, as a representative of the valid class
    - Invalid class: <1 and >1000. Select any value less than 1 or greater than 1000 as test data, as a representative of the invalid class
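A minimal sketch of the partitioning above; the class labels and the chosen representatives are illustrative, only the 1-1000 rule comes from the slide:

```python
def partition(value):
    """Return the equivalence class a test input for the 1..1000 field falls in."""
    if value < 1:
        return "invalid (<1)"
    if value > 1000:
        return "invalid (>1000)"
    return "valid (1-1000)"

# One representative per class replaces exhaustive testing of every input
for representative in (-5, 500, 1500):
    print(representative, "->", partition(representative))
```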

  24. Error Guessing
  ● The error guessing technique is based on experience and intuition
  ● It requires experience of the functional domain and its typical error-prone areas:
    - Knowledge of similar kinds of Applications
    - Knowledge of the Application's previous defect history
    - Knowledge of typical coding mistakes made by developers
  ● The objective is to use the right test data that would expose errors

  25. Decision Table

  26. Decision Table
  ● A table showing combinations of inputs with their associated outputs/actions
  ● Focused on business logic
  ● Suitable when different combinations of inputs result in different actions being taken
  ● Also referred to as a cause-effect table
  ● A Test Case should be designed for each rule in the decision table

  27. Decision Table Continued

  28. Decision Table Continued
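The example tables on the slides above were images, so here is a hypothetical decision table in code. The discount policy is invented for illustration; the technique, one rule per condition combination and one test case per rule, is what slide 26 describes:

```python
# Conditions: (customer is a member, order total is over 100) -> discount rate
RULES = {
    (True,  True):  0.20,
    (True,  False): 0.10,
    (False, True):  0.05,
    (False, False): 0.00,
}

def discount(is_member, large_order):
    """Look up the action for a combination of input conditions."""
    return RULES[(is_member, large_order)]

# Designing one test case per rule covers the whole table
for conditions, expected in RULES.items():
    assert discount(*conditions) == expected
print("all", len(RULES), "rules covered")
```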

  29. State Transition Testing
  ● Used when the system under test can be in a finite number of states, and transitions from one state to another are determined by rules
  ● A state transition diagram has four parts:
    - The states the system can occupy
    - The transitions from one state to another (not all transitions are allowed)
    - The events that cause a transition
    - The actions that result from a transition

  30. State Transition Testing
  ● Example: a shopping cart
    - States: S1 (Empty), S2 (Shopping), S3 (Summary & Cost), S4 (Payment)
    - Events: Add Item, Remove Last Item, Check Out
  ● Test case covering all possible valid transitions: S1 -> S2 -> S1 -> S2 -> S3 -> S2 -> S3 -> S4

  31. State Transition Testing
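A sketch of the shopping-cart example as an executable state machine. The event names for leaving S3 ("back" and "pay") are assumptions; the slide's diagram only labels Add Item, Remove Last Item, and Check Out:

```python
# Allowed transitions: (current state, event) -> next state
TRANSITIONS = {
    ("S1", "add item"): "S2",
    ("S2", "remove last item"): "S1",
    ("S2", "check out"): "S3",
    ("S3", "back"): "S2",          # assumed event name
    ("S3", "pay"): "S4",           # assumed event name
}

def run(events, state="S1"):
    """Replay a sequence of events; reject any transition the diagram forbids."""
    for event in events:
        if (state, event) not in TRANSITIONS:
            raise ValueError(f"invalid transition {event!r} from {state}")
        state = TRANSITIONS[(state, event)]
    return state

# The slide's path covering all valid transitions: S1->S2->S1->S2->S3->S2->S3->S4
path = ["add item", "remove last item", "add item",
        "check out", "back", "check out", "pay"]
print(run(path))  # S4
```

A test case built from an invalid sequence (e.g. checking out an empty cart) should make `run` raise, which is exactly the "not all transitions are allowed" check from slide 29.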

  32. Negative Testing / Test-to-Fail
  ● Testing the Application with invalid data:
    - Data that is out of the specified range
    - Data in the wrong format, e.g. dd/mm/yy instead of dd/mm/yyyy
    - The wrong data type, e.g. text entered instead of a numeric value
    - The wrong file format, e.g. an Excel sheet with the wrong layout used as test data: the Application expected data in the first 4 columns but the sheet has only 3 columns of data
  ● The Application is expected to handle invalid data correctly: it should display an appropriate error message guiding the user to enter valid data, and should not malfunction
  ● Design test cases with invalid test data to check whether the Application behaves as per the expected result
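A minimal sketch of negative tests against a hypothetical input handler (the field, its 1-99 range, and the messages are all invented for illustration). The point is the one made above: invalid data must produce a clear error, not a malfunction:

```python
def parse_quantity(text):
    """Hypothetical handler: accepts a whole number 1..99; returns (value, error)."""
    try:
        value = int(text)
    except (TypeError, ValueError):
        return None, "Please enter a whole number"   # wrong data type
    if not 1 <= value <= 99:
        return None, "Quantity must be between 1 and 99"  # out of range
    return value, None

print(parse_quantity("5"))    # (5, None) -> happy flow
print(parse_quantity("abc"))  # wrong data type -> error message, no crash
print(parse_quantity("150"))  # out of specified range -> error message
```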

  33. Best Practices of Test Design
  ● Design Test Cases that are:
    - Accurate: tests what it is expected to test
    - Economical: is just right in terms of detail
    - Traceable: traceable to a requirement
    - Appropriate: for the test environment
    - Self-standing: independent of the author
  ● Use test data techniques:
    - BVA
    - ECP
    - Error guessing
    - Decision tables
    - State transition diagrams
    - Negative testing
