
Software Testing


Presentation Transcript


  1. Software Testing Written by Zvika Gutterman and Adam Carmi

  2. Question? Who had the highest salary in Israel's public service in 2001? The test pilot of IAI! Why?

  3. Agenda • Goal • Introduction • Vocabulary • White Box • Black Box • Static vs. Dynamic • Example - TVRS

  4. Goal • The process of executing a program (or part of it) with the intention of finding errors. [Myers] • Any activity aimed at evaluating an attribute of a program and determining that it meets its required results. [Hetzel] • Myers: “The Art of Software Testing” [1979] • Hetzel: “The Complete Guide to Software Testing” [1988]

  5. Introduction • 50-90% of the software life cycle • TVRS vs. the aircraft industry • Usually not done by professionals • Not an exact science (can be impossible) • Complete coverage is impossible! • The number of options is exponential – the states of RAM alone are O(2^N) • Fixes introduce new bugs

  6. Introduction (cont.) • OS & hardware dependent • Programming language dependent! • C++ vs. Java! • Never ends (the client as a tester..) • There is always another bug • Marketing vs. testing • 1980s – PC • 1990s – Internet hype

  7. Introduction (cont.) • Requires writing additional code (which can itself include bugs) • Automatic testing tools exist • Mainly for known fields • Web servers (see ….) • Parasoft • Mercury Interactive

  8. Vocabulary • Bug, Fault, Error, Defect, Errata, .. , Feature • Fix, Patch, Service Pack, Update, Upgrade • Versions: Alpha, Beta, Release Candidate, 1.1.x vs. 1.2.x • Regression/Functional/White Box/Black Box/Unit/System/Integration Testing

  9. White Box – Unit Testing • Testing the implementation logic • Usually done by the programmer (good/bad?) • Tests the source code • Example: • Implementing a sorting method • Merge Sort vs. Bubble Sort
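The sorting example can be sketched as a white-box unit test: knowing the implementation is a merge sort, the tests should exercise each branch of the code. The `merge_sort` below is illustrative, not the course's actual code:

```cpp
#include <algorithm>
#include <iterator>
#include <vector>

// Hypothetical merge sort under white-box test. A unit test that knows
// this implementation should cover each branch: the base case
// (size <= 1), the recursive split, and the merge step.
std::vector<int> merge_sort(std::vector<int> v) {
    if (v.size() <= 1) return v;                    // base-case branch
    auto mid = v.begin() + v.size() / 2;
    std::vector<int> left  = merge_sort({v.begin(), mid});
    std::vector<int> right = merge_sort({mid, v.end()});
    std::vector<int> out;
    std::merge(left.begin(), left.end(),
               right.begin(), right.end(),
               std::back_inserter(out));            // merge branch
    return out;
}
```

Unit tests would then call `merge_sort` with an empty vector, a single element, and a multi-element input so that every branch runs at least once.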

  10. Black Box – Functional Testing • Use Cases and Requirements based • Interface Testing • Input-Output Testing • Use Valid and Invalid inputs • Use “real” samples

  11. Black Box – Sorting Method • Unaware of the algorithm used • Sorting: • Use complete enumeration for small examples • 10 elements (10!) • Should we test again for 11 elements? • Check for equal values (2,2,2,5,3,4,7) • Which input ranges are valid? (use extremes) • Random testing (why?)
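The "complete enumeration for small examples" idea can be sketched as follows: feed every distinct permutation of a small sequence to the sort under test and compare against the one known answer. `sort_under_test` is a stand-in for whatever implementation is being tested; as a black-box test it only uses the input/output contract:

```cpp
#include <algorithm>
#include <vector>

// Stand-in for the implementation under test (black box: we do not
// know or care which algorithm it uses internally).
std::vector<int> sort_under_test(std::vector<int> v) {
    std::sort(v.begin(), v.end());
    return v;
}

// Exhaustive black-box check for a small input: every distinct
// permutation must sort to the same expected sequence. This also
// covers the slide's "equal values" case, since std::next_permutation
// enumerates distinct permutations of a multiset.
bool passes_exhaustive_check(std::vector<int> input) {
    std::sort(input.begin(), input.end());
    const std::vector<int> expected = input;   // the one correct output
    std::vector<int> perm = input;
    do {
        if (sort_under_test(perm) != expected) return false;
    } while (std::next_permutation(perm.begin(), perm.end()));
    return true;
}
```

For the slide's example (2,2,2,5,3,4,7) this runs 7!/3! = 840 distinct permutations; for 10 distinct elements it is already 10! = 3,628,800, which is why enumeration only works for small inputs.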

  12. Testing Affects Results • Test programs • Other software using the same resources • It is not the same code • Debug vs. Release • For example: counters appear initialized to zero under a debugger (gdb) • assert(x=3);
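The slide's `assert(x=3)` is a classic debug-vs-release trap, sketched here in minimal form:

```cpp
#include <cassert>

// assert(x = 3) is an assignment, not a comparison: it overwrites x
// with 3 and the condition is always true. Worse, in a release build
// where NDEBUG is defined, the whole statement is compiled out, so x
// is never modified at all -- the debug and release builds behave
// differently, which is exactly the slide's point.
int buggy(int x) {
    assert(x = 3);      // BUG: assigns 3 to x; should be x == 3
    return x;           // debug: always 3; release: the original x
}

int correct(int x) {
    assert(x == 3);     // comparison: checks x without modifying it
    return x;
}
```

This is why a product must also be tested in its release configuration, not only under the debugger.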

  13. Acceptance Testing • Alpha testing – used to simulate client activities within the software house • Beta testing – used on a small group of clients • Advanced users? • Brave clients ..

  14. Static Testing • Compiler • C++: -Wall, -pedantic, -ansi • Static checker • e.g. Lint (http://lclint.cs.virginia.edu/) • Human code review • Formal verification

  15. Dynamic Testing • Runtime monitors • Memory management • Debugger • Unique tools – Purify / BoundsChecker • Timing • Quantify • Microsoft • Coverage • PureCoverage

  16. Testing Metrics • Number of bugs • Per day / developer / team / product • Define severity levels • No Go! • .. • Feature • Regression tools (re-approving the product) • Acceptance testing • Poor man's solution: • Time / budget

  17. Black Box - Algorithm • For each use case • Identify the different cases (called equivalence classes) • Identify endpoints • Create tests • Prefer automatic tests • Iterate over all supported environments • Prioritize tests • Among use cases • Among different tests

  18. Example – TVRS – Black Box • Use case 3 – Find Policeman • Equivalence classes • A policeman with the searched ID exists / does not exist • A policeman with the searched name exists / does not exist • Several policemen with the searched name exist • A policeman with the searched ID was just deleted • .. • Endpoints • Does the system allow entering IDs with a varying number of digits? • ..
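The equivalence classes and endpoints above can be made concrete with a small input validator. The rule below (a valid policeman ID is exactly 9 digits) is an assumption for illustration only; the TVRS slides do not specify the ID format:

```cpp
#include <cctype>
#include <string>

// Hypothetical TVRS helper: validate a policeman ID before searching.
// Assumed rule (illustration only): a valid ID is exactly 9 digits.
// Equivalence classes: valid 9-digit string; too short; too long;
// contains a non-digit; empty. Endpoints around the length
// requirement: 8, 9 and 10 characters.
bool is_valid_policeman_id(const std::string& id) {
    if (id.size() != 9) return false;
    for (char c : id)
        if (!std::isdigit(static_cast<unsigned char>(c)))
            return false;
    return true;
}
```

A black-box test suite then picks one representative input per class plus the boundary lengths, rather than enumerating all possible IDs.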

  19. Example – TVRS – Black Box

  20. Example – TVRS – Black Box
