
TESTING

This presentation covers the basics of testing in software engineering, including the different types of faults, testing issues, unit testing, integration testing, and test planning.


Presentation Transcript


  1. Software Engineering TESTING Compiled by: Dr. S. Prem Kumar, Professor & HOD CSE, G. Pullaiah College of Engineering and Technology, Kurnool. Source: These slides are designed and adapted from slides provided with Software Engineering: A Practitioner’s Approach, 7/e (McGraw-Hill 2009) by Roger Pressman and Software Engineering, 9/e (Addison Wesley 2011) by Ian Sommerville. Department of Computer Science and Engineering

  2. Testing the Programs

  3. Contents • 1 Software Faults and Failures • 2 Testing Issues • 3 Unit Testing • 4 Integration Testing • 5 Testing Object-Oriented Systems • 6 Test Planning • 7 Automated Testing Tools • 8 When to Stop Testing • 9 Information System Example • 10 Real-Time Example • 11 What this Chapter Means for You

  4. Chapter 8 Objectives • Types of faults and how to classify them • The purpose of testing • Unit testing • Integration testing strategies • Test planning • When to stop testing

  5. 1 Software Faults and Failures: Why Does Software Fail? • Wrong requirement: not what the customer wants • Missing requirement • Requirement impossible to implement • Faulty design • Faulty code • Improperly implemented design


  8. 1 Software Faults and Failures: Objective of Testing • Objective of testing: discover faults • A test is successful only when a fault is discovered • Fault identification is the process of determining what fault caused the failure • Fault correction is the process of making changes to the system so that the faults are removed
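
The slide's definition (a test is successful only when it discovers a fault) can be illustrated with a minimal Python sketch. The `average` function and its seeded bug are hypothetical examples, not from the slides:

```python
# Hypothetical component with a seeded fault: the divisor is hard-coded.
def average(values):
    return sum(values) / 2  # fault: should divide by len(values)

# A "successful" test, in the slide's sense, is one that exposes the fault.
def test_average():
    return average([1, 2, 3]) == 2  # the mean of 1, 2, 3 is 2, but we get 3.0

# Fault identification: the divisor is wrong.
# Fault correction: replace the constant with len(values).
def average_corrected(values):
    return sum(values) / len(values)
```

Here `test_average` returning False is the valuable outcome: it demonstrates the fault, which then drives identification and correction.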

  9. 1 Software Faults and Failures: Types of Faults • Algorithmic fault • Computation and precision fault: a formula’s implementation is wrong • Documentation fault: documentation doesn’t match what the program does • Capacity or boundary faults: system’s performance not acceptable when certain limits are reached • Timing or coordination faults • Performance faults: system does not perform at the speed prescribed • Standard and procedure faults

  10. 1 Software Faults and Failures: Typical Algorithmic Faults • An algorithmic fault occurs when a component’s algorithm or logic does not produce proper output • Branching too soon • Branching too late • Testing for the wrong condition • Forgetting to initialize a variable or set loop invariants • Forgetting to test for a particular condition • Comparing variables of inappropriate data types • Syntax faults
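
One fault from the list above, "forgetting to initialize a variable", can be made concrete with a small hypothetical Python example:

```python
# Algorithmic fault: the accumulator starts at the wrong value,
# so every result is off by one.
def count_positives_faulty(nums):
    count = 1  # fault: accumulator not initialized to 0
    for n in nums:
        if n > 0:
            count += 1
    return count

# Corrected version with proper initialization.
def count_positives(nums):
    count = 0
    for n in nums:
        if n > 0:
            count += 1
    return count
```

A boundary-style test case such as an empty list also catches this fault immediately: the faulty version returns 1 instead of 0.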

  11. 1 Software Faults and Failures: Orthogonal Defect Classification

  12. 1 Software Faults and Failures: Sidebar 1 Hewlett-Packard’s Fault Classification

  13. 1 Software Faults and Failures: Sidebar 1 Faults for One Hewlett-Packard Division

  14. 2 Testing Issues: Testing Organization • Module testing, component testing, or unit testing • Integration testing • Function testing • Performance testing • Acceptance testing • Installation testing

  15. 2 Testing Issues: Testing Organization Illustrated

  16. 2 Testing Issues: Attitude Toward Testing • Egoless programming: programs are viewed as components of a larger system, not as the property of those who wrote them

  17. 2 Testing Issues: Who Performs the Test? • Independent test team • avoid conflict • improve objectivity • allow testing and coding concurrently

  18. 2 Testing Issues: Views of the Test Objects • Closed box or black box: functionality of the test objects • Clear box or white box: structure of the test objects
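
The two views can be sketched on one hypothetical component. The function and both case lists below are illustrative assumptions, not content from the slides:

```python
# Component under test: classify the sign of an integer.
def sign(n):
    if n < 0:
        return "negative"
    if n == 0:
        return "zero"
    return "positive"

# Closed-box (black-box) cases: derived only from the specification
# ("report the sign of an integer"), without looking at the code.
black_box_cases = [(-5, "negative"), (0, "zero"), (7, "positive")]

# Clear-box (white-box) cases: chosen so that every branch of the
# implementation above executes at least once.
white_box_cases = [(-1, "negative"), (0, "zero"), (1, "positive")]

def run(cases):
    return all(sign(n) == expected for n, expected in cases)
```

The case lists happen to coincide here because the code mirrors the specification; when it does not, the two views expose different faults.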

  19. 2 Testing Issues: Closed Box • Advantage • free of the constraints imposed by the internal structure • Disadvantage • not possible to run a complete test

  20. 2 Testing Issues: Clear Box • Example of logic structure

  21. 2 Testing Issues: Sidebar 2 Box Structures • Black box: external behavior description • State box: black box with state information • White box: state box with a procedure

  22. 2 Testing Issues: Factors Affecting the Choice of Test Philosophy • The number of possible logical paths • The nature of the input data • The amount of computation involved • The complexity of algorithms

  23. 3 Unit Testing: Code Review • Code walkthrough • Code inspection

  24. 3 Unit Testing: Typical Inspection Preparation and Meeting Times

  25. 3 Unit Testing: Fault Discovery Rate

  26. 3 Unit Testing: Sidebar 3 The Best Team Size for Inspections • The preparation rate, not the team size, determines inspection effectiveness • The team’s effectiveness and efficiency depend on their familiarity with their product

  27. 3 Unit Testing: Proving Code Correct • Formal proof techniques • Symbolic execution • Automated theorem-proving

  28. 3 Unit Testing: Proving Code Correct: An Illustration

  29. 3 Unit Testing: Testing versus Proving • Proving: hypothetical environment • Testing: actual operating environment

  30. 3 Unit Testing: Steps in Choosing Test Cases • Determining test objectives • Selecting test cases • Defining a test

  31. 3 Unit Testing: Test Thoroughness • Statement testing • Branch testing • Path testing • Definition-use testing • All-uses testing • All-predicate-uses/some-computational-uses testing • All-computational-uses/some-predicate-uses testing
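
The difference between the first two criteria, statement testing and branch testing, shows up already in a two-line function. This sketch is a hypothetical illustration:

```python
# Hypothetical component: absolute value.
def absolute(x):
    result = x
    if x < 0:
        result = -x
    return result

# Statement testing: the single case x = -1 executes every statement,
# yet never exercises the "false" outcome of the if.
statement_cases = [-1]

# Branch testing: requires both outcomes of the decision, so it adds a
# case (e.g. x = 3) that skips the if body.
branch_cases = [-1, 3]

def exercise(cases):
    return [absolute(x) for x in cases]
```

A fault hiding on the untaken branch (say, `result = x + 1` instead of `result = x`) would pass the statement-adequate suite but fail the branch-adequate one, which is why branch testing is the stronger criterion in the comparison that follows.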

  32. 3 Unit Testing: Relative Strengths of Test Strategies

  33. 3 Unit Testing: Comparing Techniques • Fault discovery percentages by fault origin

  34. 3 Unit Testing: Comparing Techniques (continued) • Effectiveness of fault-discovery techniques

  35. 3 Unit Testing: Sidebar 4 Fault Discovery Efficiency at Contel IPC • 17.3% during inspections of the system design • 19.1% during component design inspection • 15.1% during code inspection • 29.4% during integration testing • 16.6% during system and regression testing • 0.1% after the system was placed in the field

  36. 4 Integration Testing • Bottom-up • Top-down • Big-bang • Sandwich testing • Modified top-down • Modified sandwich

  37. 4 Integration Testing: Terminology • Component driver: a routine that calls a particular component and passes a test case to it • Stub: a special-purpose program to simulate the activity of the missing component
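
Both terms can be shown in a few lines of Python. The component, stub, and driver below are hypothetical names invented for illustration:

```python
# Component under test: formats a summary using a collaborator
# (the formatter) that has not been written yet.
def summarize(total, formatter):
    return formatter(total)

# Stub: a special-purpose stand-in that simulates the missing
# formatting component just well enough for this test.
def formatter_stub(total):
    return f"TOTAL:{total}"

# Component driver: calls the component under test, passes it a
# test case, and checks the result.
def driver():
    return summarize(42, formatter_stub) == "TOTAL:42"
```

Top-down integration leans on stubs (callers exist, callees do not); bottom-up integration leans on drivers (callees exist, callers do not).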

  38. 4 Integration Testing: View of a System • System viewed as a hierarchy of components

  39. 4 Integration Testing: Bottom-Up Integration Example • The sequence of tests and their dependencies

  40. 4 Integration Testing: Top-Down Integration Example • Only A is tested by itself

  41. 4 Integration Testing: Modified Top-Down Integration Example • Each level’s components are individually tested before the merger takes place

  42. 4 Integration Testing: Big-Bang Integration Example • Requires both stubs and drivers to test the independent components

  43. 4 Integration Testing: Sandwich Integration Example • The system is viewed as three layers

  44. 4 Integration Testing: Modified Sandwich Integration Example • Allows upper-level components to be tested before merging them with others

  45. 4 Integration Testing: Comparison of Integration Strategies

  46. 4 Integration Testing: Sidebar 5 Builds at Microsoft • The feature teams synchronize their work by building the product and finding and fixing faults on a daily basis

  47. 5 Testing Object-Oriented Systems: Questions at the Beginning of Testing an OO System • Is there a path that generates a unique result? • Is there a way to select a unique result? • Are there useful cases that are not handled?

  48. 5 Testing Object-Oriented Systems: Easier and Harder Parts of Testing OO Systems • OO unit testing is less difficult, but integration testing is more extensive

  49. 5 Testing Object-Oriented Systems: Differences Between OO and Traditional Testing • The farther out the gray line is, the greater the difference

  50. 6 Test Planning • Establish test objectives • Design test cases • Write test cases • Test the test cases • Execute tests • Evaluate test results
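
The planning steps above can be sketched as data plus a runner. The component, objectives, and cases below are hypothetical, chosen only to make each step concrete:

```python
# Component under test (hypothetical): integer square root by search.
def int_sqrt(n):
    r = 0
    while (r + 1) * (r + 1) <= n:
        r += 1
    return r

# Designed and written test cases, each tied to a test objective:
# (objective, input, expected output).
test_plan = [
    ("handles zero",  0, 0),
    ("exact square",  9, 3),
    ("rounds down",  10, 3),
]

# Execute tests and evaluate the results against expectations.
def execute_plan():
    return [(name, int_sqrt(arg) == expected)
            for name, arg, expected in test_plan]
```

"Testing the test cases" here would mean reviewing the expected values themselves (e.g. confirming that the integer square root of 10 really is 3) before trusting the plan's verdict.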
