  1. TESTING…

  2. Overview
  • Motivation
  • Testing glossary
  • Quality issues
  • Non-execution-based testing
  • Execution-based testing
  • What should be tested?
  • Correctness proofs
  • Who should perform execution-based testing?
  • Testing distributed SW
  • Testing real-time SW
  • When does testing stop?
  • Summary

  4. Motivation
  • SW life cycle models too often include a separate testing phase…
  • Nothing could be more dangerous!
  • Testing should be carried out continuously throughout the SW life cycle.

  6. Testing – Glossary …
  • “V & V” vs. testing:
  • Verification – determine whether the phase was completed correctly. (Takes place at the end of each phase.) Boehm: Verification = “Are we building the product right?”
  • Validation – determine whether the product as a whole satisfies its requirements. (Takes place before the product is handed to the client.) Boehm: Validation = “Are we building the right product?”

  7. Testing – Glossary (Cont’d)
  • Warning:
  • “Verify” is also used for all non-execution-based testing.
  • V&V might imply that there is a separate phase for testing.
  • There are two types of testing:
  • Execution-based testing
  • Non-execution-based testing
  • It is impossible to ‘execute’ the MRD or EPS.
  • On the other hand, is code testing enough/efficient for the implementation phases?

  9. Quality … Quality…

  10. Quality (Cont’d)
  • Quality:
  • Peculiar and essential character
  • An inherent feature
  • Degree of excellence
  • Superiority in kind
  • An intelligible feature by which a thing may be identified

  11. SW Quality …
  • In other areas, quality implies excellence
  • Not here!
  • The quality of SW is the extent to which the product satisfies its specifications.

  12. SW Quality (Cont’d) …
  • Very often bugs are found as the delivery deadline approaches: release a faulty product, or deliver late
  • Have a separate SW Quality Assurance (SQA) team
  • Instead of 100 programmers devoting 30% of their time to SQA activities, have full-time SQA professionals
  • In a small company, utilize cross-review

  13. SW Quality (Cont’d)
  • Managerial independence:
  • Development group
  • SQA group

  14. The SQA Team Responsibilities
  • To ensure that the current phase is correct
  • To ensure that the development phases have been carried out correctly
  • To check that the product as a whole is correct
  • The development of various standards and tools to which the SW and the SW development must conform [CMM level?]
  • Establishment of monitoring procedures for assuring compliance with those standards

  16. Non-Execution-Based Testing …
  • Underlying principles:
  • Group synergy
  • We should not review our own work
  • Our own blind spot: cover your right eye and stare at the red circle. Then slowly move away from the page (or, if you are already far, move toward it) without looking at the blue star. At some point the blue star disappears from the picture. That is your blind spot!

  17. Non-Execution-Based Testing …

  18. Non-Execution-Based Testing (Cont’d)
  • Non-execution-based testing:
  • Walkthrough
  • Inspection
  • (Peer reviews)

  19. Walkthrough – The Team
  • 4–6 members, representatives from:
  • Specification team member (document author?)
  • Specification team manager
  • Client
  • Next team (spec’s clients)
  • SQA – chairman of the walkthrough

  20. Walkthrough – Preparations
  • Set the team
  • Distribute the spec in advance
  • Each reviewer prepares two lists:
  • Things he/she does not understand
  • Things he/she thinks are incorrect
  • Execute the walkthrough session(s)

  21. The Walkthrough Session …
  • Chaired by SQA (the one who will lose most…), whose roles are:
  • Elicit questions
  • Facilitate discussion
  • Prevent a ‘point-scoring session’
  • Prevent an annual-evaluation session (remember the team leader and team manager…)
  • Sessions of up to 2 hours
  • Might be participant-driven or document-driven
  • Verbalization leads to fault finding!
  • Most faults are found by the presenter!

  22. The Walkthrough Session (Cont’d)
  • Detect faults – do not correct them!
  • Why?
  • Cost–benefit of the correction (6 members…)
  • Faults should be analyzed carefully
  • There is not enough time in the session
  • The ‘committee attitude’

  23. Inspection …
  • A more formal process, with six stages:
  • Planning – set the team, set the schedule
  • Overview session – overview & document distribution
  • Preparation – learn the spec, aided by statistics of fault types; that is, utilize the organization’s knowledge base
  • Inspection – walk through the document, verifying each item. A formal summary will be distributed, with ARs (action-required items: tasks and due dates)
  • Rework – fault resolving
  • Follow-up – every issue is resolved: fix or clarification

  24. Inspection (Cont’d) …
  • Team of five:
  • Moderator (e.g., the spec team leader):
  • Manages the inspection – team and object
  • Ensures that the team takes a positive approach
  • Specification author:
  • Answers questions about the product
  • Reader:
  • Reads the doc aloud
  • Recorder:
  • Documents the results of the inspection
  • SQA, inspector, specialist:
  • Provides an independent assessment of the spec

  25. Inspections (Cont’d)
  • Use a checklist of potential faults:
  • Is each item of the spec correctly addressed?
  • In the case of an interface, do actual and formal arguments correspond?
  • Have error-handling mechanisms been identified?
  • Is the SW design compatible with the HW design?
  • Etc.
  • Throughout the inspection: fault recording

  26. Fault Statistics
  • Recorded by severity and fault type:
  • Major (premature termination, DB damage, etc.)
  • Or minor
  • Usage of the data:
  • Compare with previous products
  • What if there is a disproportionate number of faults in a specific module?
  • Maybe – redesign from scratch?
  • Carry forward fault statistics to the next phase
  • Not for performance appraisal!

  27. Inspection – Example [Fagan 1976]
  • 100 person-hour task
  • Rate of two 2-hour inspections per day
  • Four-person team
  • 100 / (5 × 4) = 5 working days
  • 67% of all faults were detected before module execution-based testing!
  • 38% fewer faults than a comparable product

  28. Statistics on Inspections
  • 93% of all detected faults (IBM, 1986)
  • 90% decrease in the cost of detecting faults (switching system, 1986)
  • 4 major faults, 14 minor faults per 2 hours (JPL, 1990); savings of $25,000 per inspection
  • Number of faults decreased exponentially by phase (JPL, 1992)

  29. Review Strengths and Weaknesses
  • Strengths:
  • Effective way of detecting faults
  • Early detection
  • Saving $
  • Weaknesses:
  • Depends upon an adequate process
  • Large-scale SW is extremely hard to review (unless modular – OOP)
  • Depends upon previous-phase documents
  • Might be used for performance appraisal

  30. Metrics for Inspections
  • Fault density:
  • Faults per page, or –
  • Faults per KLOC
  • By severity (major/minor)
  • By phase
  • Fault detection rate (e.g., faults detected per hour)
  • Fault detection efficiency (e.g., faults detected per person-hour)
  • What does a 50% increase in the fault detection rate mean?
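The inspection metrics above can be sketched in a few lines of Python. The function names and the sample numbers are illustrative, not from the slides:

```python
# Sketch of the inspection metrics listed above (names are illustrative).

def fault_density(faults: int, pages: int) -> float:
    """Faults found per page of the inspected document."""
    return faults / pages

def fault_detection_rate(faults: int, inspection_hours: float) -> float:
    """Faults detected per hour of inspection."""
    return faults / inspection_hours

def fault_detection_efficiency(faults: int, person_hours: float) -> float:
    """Faults detected per person-hour invested."""
    return faults / person_hours

# Example: 18 faults found in a 24-page spec during a 2-hour session
# with a 4-person team (8 person-hours total).
print(fault_density(18, 24))              # 0.75 faults/page
print(fault_detection_rate(18, 2.0))      # 9.0 faults/hour
print(fault_detection_efficiency(18, 8))  # 2.25 faults/person-hour
```

The closing question on the slide is why the metrics are tracked separately: a 50% jump in detection rate could mean a better team, or a much buggier document.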

  32. Execution-Based Testing
  • Definitions:
  • Failure (incorrect behavior)
  • Error (mistake made by the programmer)
  • Nonsensical statement:
  • “Testing is the demonstration that faults are not present.”
  • Dijkstra:
  • “Program testing can be a very effective way to show the presence of bugs, but it is hopelessly inadequate for showing their absence.” [Dijkstra, 1972]

  33. What Is Execution-Based Testing?
  • “The process of inferring certain behavioral properties of a product based, in part, on the results of executing the product in a known environment with selected inputs.” [IEEE 610.12, 1990]
  • Troubling implications:
  • Inference:
  • Trying to find out whether there is a black cat in a dark room
  • Known environment?
  • Neither the SW nor the HW is really known
  • Selected inputs:
  • What about RT systems? (e.g., an avionics system)
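Dijkstra's remark and the danger of "selected inputs" can be illustrated with a small hypothetical example (not from the slides): a buggy leap-year function that nevertheless passes every test in a carefully selected input set.

```python
def is_leap_year(year: int) -> bool:
    """BUGGY: ignores the century rule (e.g., 1900 is not a leap year)."""
    return year % 4 == 0

def is_leap_year_correct(year: int) -> bool:
    """Gregorian rule: divisible by 4, except centuries not divisible by 400."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Selected inputs on which the buggy version happens to behave correctly:
for year in (1996, 2000, 2004, 2023):
    assert is_leap_year(year) == is_leap_year_correct(year)  # all pass!

# Yet the fault is present: 1900 is NOT a leap year.
print(is_leap_year(1900))          # True  (wrong)
print(is_leap_year_correct(1900))  # False
```

Every selected test passes, so testing has shown nothing about the absence of the fault; one input outside the selection exposes it.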

  35. But What Should Be Tested?
  • Utility
  • Reliability
  • Robustness
  • Performance
  • Correctness

  36. Utility
  • Utility – the extent to which the user’s needs are met when a correct product is used under conditions permitted by its specification
  • Does it meet the user’s needs?
  • Ease of use
  • Useful functions
  • Cost-effectiveness
  • Utility should be tested first, and if the product fails on that score, testing should stop

  37. Reliability
  • Reliability – a measure of the frequency and criticality of product failure
  • Frequency and criticality of failure:
  • MTBF – Mean Time Between Failures
  • MTTR – Mean Time To Repair
  • Mean time and cost to repair the results of a failure
  • Suppose our SW fails only once every six months, but when it fails it completely wipes out a database. The SW can be re-run within 2 hours, but the DB reconstruction might take a week
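As a sketch of why repair time matters as much as failure frequency, the standard steady-state availability formula MTBF / (MTBF + MTTR) can be applied to the slide's scenario. The formula is standard reliability engineering rather than something stated on the slide, and the hour counts below are rough assumptions:

```python
# Steady-state availability (standard formula, assumed here):
# fraction of time the system is operational.

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    return mtbf_hours / (mtbf_hours + mttr_hours)

six_months = 6 * 30 * 24   # ~4320 hours between failures (rough assumption)
rerun_only = 2             # SW restarts within 2 hours
with_db_rebuild = 7 * 24   # DB reconstruction takes a week (168 hours)

print(round(availability(six_months, rerun_only), 5))       # 0.99954
print(round(availability(six_months, with_db_rebuild), 5))  # 0.96257
```

Same failure frequency, but the week-long database rebuild drops availability by roughly 3.7 percentage points, which is the slide's point: criticality of failure, not just frequency.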

  38. Robustness …
  • Range of operating conditions:
  • Possibility of unacceptable results with valid input
  • Effect of invalid input
  • A product with a wide range of permissible operating conditions is more robust than a product that is more restrictive

  39. Robustness (Cont’d)
  • A robust product should not yield unacceptable results when the input satisfies its specifications
  • A robust product should not crash when it is operated outside its permissible operating conditions
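A minimal sketch of these two requirements, using a hypothetical `safe_sqrt` function with an explicitly stated permissible operating range: valid input must yield an acceptable result, and input outside the range must be rejected cleanly rather than crash or return garbage.

```python
import math

def safe_sqrt(x: float) -> float:
    """Square root with an explicit permissible operating range: x >= 0."""
    if not isinstance(x, (int, float)):
        raise TypeError(f"numeric input required, got {type(x).__name__}")
    if x < 0:
        raise ValueError("input outside permissible range (x must be >= 0)")
    return math.sqrt(x)

print(safe_sqrt(9.0))  # 3.0 -- valid input, acceptable result
try:
    safe_sqrt(-1.0)    # invalid input: rejected with a clear error, no crash
except ValueError as err:
    print(err)
```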

  40. Performance
  • The extent to which space and time constraints are met
  • Real-time SW – hard time constraints: can the CPU process an image within 5 ms? (For a 200 Hz sampling rate?)

  41. Correctness
  • A product is correct if it satisfies its output specifications, independent of its use of computing resources, when operated under permitted conditions

  43. Correctness Proofs (Verification)
  • A mathematical technique for showing that a product is correct
  • Correctness means it satisfies its specifications
  • Is it an alternative to execution-based testing?

  44. Specifications Correctness …
  • Specification for a sort:
  • Are these good specifications?
  • Function trickSort satisfies these specifications:

  45. Specifications Correctness (Cont’d)
  • Incorrect specification for a sort:
  • Corrected specification for the sort:
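The specifications and the trickSort code shown on these slides are not reproduced in the transcript, so the following Python sketch reconstructs the idea under that assumption: a specification that demands only "the output is in nondecreasing order", and omits that the output must be a permutation of the input, is satisfied by a useless function.

```python
# Hedged reconstruction of the trickSort idea (the slide's actual code is
# not in the transcript). Under the incomplete spec "the output array is
# in nondecreasing order", this useless function is formally correct:

def trick_sort(a: list) -> list:
    return [0] * len(a)   # trivially nondecreasing, but not a sort

def satisfies_incomplete_spec(out: list) -> bool:
    """Incomplete spec: output is in nondecreasing order."""
    return all(out[i] <= out[i + 1] for i in range(len(out) - 1))

def satisfies_corrected_spec(inp: list, out: list) -> bool:
    """Corrected spec: nondecreasing AND a permutation of the input."""
    return satisfies_incomplete_spec(out) and sorted(inp) == sorted(out)

data = [3, 1, 2]
print(satisfies_incomplete_spec(trick_sort(data)))       # True
print(satisfies_corrected_spec(data, trick_sort(data)))  # False
```

This is the point of the slide: a correctness proof is only as good as the specification being proved.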

  46. Correctness
  • NOT sufficient – as demonstrated in the previous example
  • NOT a showstopper – consider a new compiler that is:
  • Twice as fast
  • Object code is 20% smaller
  • Object code is 20% faster
  • Much clearer error messages
  • But – a single error message for the first ‘for’ statement encountered in any class
  • Will you use it?

  47. Correctness Proofs – Glossary
  • An assertion is: a claim that a certain mathematical property holds true at a given point
  • An invariant is: a mathematical expression that holds true under all conditions tested
  • An input specification is: a condition that holds true before the code is executed
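These three glossary terms can be illustrated with a small assumed example (not from the slides): a function summing the integers 0 through n−1, with the input specification, a loop invariant, and a final assertion checked at run time.

```python
def sum_first_n(n: int) -> int:
    # Input specification: a condition that holds before the code runs.
    assert n >= 0, "input specification: n must be nonnegative"
    total, i = 0, 0
    while i < n:
        # Loop invariant: holds on every iteration of the loop.
        # total == 0 + 1 + ... + (i - 1) == i * (i - 1) // 2
        assert total == i * (i - 1) // 2
        total += i
        i += 1
    # Assertion at a given point: the closed-form result holds on exit.
    assert total == n * (n - 1) // 2
    return total

print(sum_first_n(5))  # 10
```

In a correctness proof these properties are established mathematically for all inputs; the assertions here merely check them for the inputs actually executed, which is exactly the gap between proving and testing.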

  48. Example of Correctness Proof …
  • Code to be proven correct.

  49. Example (Cont’d) … • Flowchart of code segment:

  50. Example (Cont’d) …