
Testing Techniques in Software Engineering

Explore various testing techniques used in software engineering including black-box testing, white-box testing, module testing, integration testing, and more.


Presentation Transcript


  1. BACKGROUND
  • In general, there is no way to test programs exhaustively (that is, going through all execution paths with all possible values).
  • Therefore testing cannot guarantee correctness.
  • In spite of this, testing is a very important way to improve the quality of programs and to eliminate errors.
  (Software Engineering 2004, Jyrki Nummenmaa)
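
  To see why exhaustive testing is out of reach, a back-of-the-envelope calculation (the figures are illustrative assumptions, and one billion tests per second is wildly optimistic):

      # A function of just two 32-bit integer inputs already has 2**64
      # possible input combinations.
      combinations = 2 ** 64
      tests_per_second = 10 ** 9            # optimistic assumed test rate
      seconds_per_year = 60 * 60 * 24 * 365
      years = combinations / (tests_per_second * seconds_per_year)
      print(f"about {years:.0f} years")     # roughly 585 years, for two ints alone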

  2. OO TESTING
  • OO software can be designed and built in many ways, using different kinds of modeling and documentation techniques.
  • Testing should reflect these techniques and models (if, for instance, you use use cases in analysis and design, then you should use them as input for testing as well).

  3. BLACK-BOX TESTING
  • A set of inputs is produced based on knowledge of the functional specification of the software.
  • The software is executed and the outputs are compared to the correct outputs.
  • Testing is based on the functional specification and not on knowledge of the internals of the software (the software is viewed as a black box).
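
  A minimal sketch of a black-box test in Python's unittest. The inputs come from the functional specification of leap years alone; the implementation (here the standard library's calendar.isleap, standing in for the software under test) is never inspected:

      import unittest
      from calendar import isleap  # stand-in for the software under test

      class TestLeapYearSpec(unittest.TestCase):
          # Inputs are chosen from the specification alone,
          # not from the implementation.
          def test_divisible_by_four_is_leap(self):
              self.assertTrue(isleap(2024))

          def test_century_is_not_leap(self):
              self.assertFalse(isleap(1900))

          def test_divisible_by_four_hundred_is_leap(self):
              self.assertTrue(isleap(2000))

      if __name__ == "__main__":
          unittest.main()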

  4. WHITE-BOX TESTING
  • In the creation of test data, knowledge of the software's implementation is also used.
  • This way, testing can be aimed towards expected weak spots in the implementation.
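
  A sketch of the idea in Python: the tester reads the implementation (the classify function below is hypothetical) and chooses inputs that force each branch, especially the boundaries between branches:

      import unittest

      # Hypothetical implementation under test: the tester has read this
      # code and writes inputs that exercise each branch.
      def classify(age):
          if age < 0:
              raise ValueError("negative age")
          elif age < 18:
              return "minor"
          else:
              return "adult"

      class TestClassifyBranches(unittest.TestCase):
          def test_negative_branch(self):
              with self.assertRaises(ValueError):
                  classify(-1)

          def test_minor_branch_at_boundary(self):
              self.assertEqual(classify(17), "minor")  # just below the branch point

          def test_adult_branch_at_boundary(self):
              self.assertEqual(classify(18), "adult")  # exactly at the branch point

      if __name__ == "__main__":
          unittest.main()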

  5. MODULE TESTING
  • A single software module (e.g. a class in OO programming) is tested.
  • Modules often need other modules, and these may not have been implemented yet.
  • Therefore, module testing in general needs a test environment, which potentially contains some "dummy implementations" of other modules; they could, for instance, just return a constant value.
  • Usually done by the software developer.
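
  A sketch of such a test environment in Python's unittest. Both the Invoice module under test and the StubTaxService dummy (standing in for a module that does not exist yet) are hypothetical:

      import unittest

      # Dummy implementation: the real tax module is not implemented yet,
      # so the test environment substitutes one that returns a constant.
      class StubTaxService:
          def rate_for(self, country):
              return 0.20

      # The module under test, which depends on the tax module.
      class Invoice:
          def __init__(self, net, tax_service):
              self.net = net
              self.tax_service = tax_service

          def total(self, country):
              return self.net * (1 + self.tax_service.rate_for(country))

      class TestInvoiceModule(unittest.TestCase):
          def test_total_applies_tax_rate(self):
              invoice = Invoice(100.0, StubTaxService())
              self.assertAlmostEqual(invoice.total("FI"), 120.0)

      if __name__ == "__main__":
          unittest.main()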

  6. Class testing (OO module testing)
  • Follows unit/module testing principles.
  • Typically, classes represent an encapsulated (and hopefully rather independent) unit.
  • As with unit testing in general, this is typically the software developer's job.

  7. INTEGRATION TESTING
  • Modules are integrated to form subsystems.
  • These subsystems are tested.
  • There are a number of ways in which the modules may fail to work together.
  • This testing stage may find new module-level errors.
  • Also, higher-level design flaws may be detected.
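
  A sketch in Python's unittest: two hypothetical modules (Tokenizer and WordCounter), each assumed to pass its own module tests, are exercised together to check the assumptions they make about each other:

      import unittest

      # Two hypothetical modules, each already tested in isolation.
      class Tokenizer:
          def tokenize(self, text):
              return text.lower().split()

      class WordCounter:
          def count(self, tokens):
              counts = {}
              for token in tokens:
                  counts[token] = counts.get(token, 0) + 1
              return counts

      class TestTokenizerCounterIntegration(unittest.TestCase):
          # The integration test checks the modules' mutual assumptions,
          # e.g. that the counter receives lower-cased tokens.
          def test_pipeline_counts_regardless_of_case(self):
              tokens = Tokenizer().tokenize("To be OR not to be")
              self.assertEqual(WordCounter().count(tokens)["to"], 2)

      if __name__ == "__main__":
          unittest.main()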

  8. Increment integration testing
  • When using incremental software development, you typically first integration-test the new increment on its own.
  • Then you integration-test the new increment together with the existing system.
  (Diagram: the new increment is integrated with the earlier increments.)

  9. VALIDATION TESTING
  • Validating the implemented system against the requirements.
  • At this point, the use cases should be executed, as sketched below.
  • If use cases are described using sequence diagrams or activity diagrams, it is good to trace the executions and compare them to the diagrams (unfortunately there is not much tool support for this).
  • As a software developer, you should initially check whether you can run through the use cases.
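
  A sketch of executing a use case as a test. The Shop facade and the "place order" use case are hypothetical; each test step mirrors a step of the written use case:

      import unittest

      # Hypothetical application facade for the "place order" use case.
      class Shop:
          def __init__(self):
              self.cart, self.orders = [], []

          def add_to_cart(self, item):
              self.cart.append(item)

          def checkout(self):
              self.orders.append(list(self.cart))
              self.cart = []
              return len(self.orders)  # order number

      class TestPlaceOrderUseCase(unittest.TestCase):
          def test_main_success_scenario(self):
              shop = Shop()
              shop.add_to_cart("book")       # use case step 1: select item
              order_no = shop.checkout()     # use case step 2: place order
              self.assertEqual(order_no, 1)  # postcondition: order recorded
              self.assertEqual(shop.cart, [])

      if __name__ == "__main__":
          unittest.main()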

  10. SYSTEM TESTING
  • The whole system (hardware and software components together) is tested.
  • Testing should put the system under a realistic load.
  • Stress testing is testing the system under a high load. In some systems, it is important to do this at this point.
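
  A minimal load-generation sketch in Python; handle_request is a hypothetical stand-in for a real request into the system under test:

      import time
      from concurrent.futures import ThreadPoolExecutor

      def handle_request(i):
          # Hypothetical stand-in for a real request into the system.
          time.sleep(0.001)
          return i

      # Put the system under load: many concurrent requests, then check
      # that all completed and how long the whole batch took.
      start = time.perf_counter()
      with ThreadPoolExecutor(max_workers=50) as pool:
          results = list(pool.map(handle_request, range(1000)))
      elapsed = time.perf_counter() - start
      assert len(results) == 1000
      print(f"1000 requests in {elapsed:.2f}s")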

  11. ALPHA AND BETA TESTING
  • Alpha testing: real users test the system on the software producer's computers.
  • This is often not very realistic anymore.
  • Beta testing: real users test the system on their own computers.

  12. REGRESSION TESTING
  • Re-executing the already passed tests.
  • This is important after the software has been changed or fixed, as fixing one thing may well break another.
  • In regression testing, test automation is highly important and valuable.
  • We want a repeatable testing process.
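
  A sketch of a repeatable regression suite in Python's unittest. Both normalize and the tests/cases.json file of stored input/expected-output pairs are hypothetical:

      import json
      import unittest

      def normalize(text):
          # Hypothetical function under regression test.
          return " ".join(text.split()).lower()

      class TestRegression(unittest.TestCase):
          def test_stored_cases(self):
              # Hypothetical file of (input, expected) pairs recorded from
              # earlier, already-passed test runs.
              with open("tests/cases.json") as f:
                  cases = json.load(f)
              for given, expected in cases:
                  with self.subTest(given=given):
                      self.assertEqual(normalize(given), expected)

      if __name__ == "__main__":
          unittest.main()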

  13. GENERAL ADVICE
  • When you develop your software, you test it.
  • Build your tests as programs or as methods of your classes, and store your inputs.
  • This helps you in regression testing (when you have changed something, you will want to know whether you have broken something that was previously OK).
  • Document the test set you create (at least keep a list of the test material you have and how to run it).

  14. AUTOMATED TESTING
  • Testing takes a lot of time and effort. Typically a large number of tests need to be run several times on the software.
  • Therefore, there is a need to automate testing: make a computer program that executes a predefined test set, compares the results with the expected results, and reports the deviations.
  • This typically works best in module testing.
  • However, even GUI testing automation systems now exist, and they provide a massive speedup compared to human testers.
  • Lots of tools for automated testing exist.
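
  That loop is small enough to sketch directly; square and its test set below are hypothetical:

      # Execute a predefined test set, compare actual to expected output,
      # and report the deviations.
      def square(x):
          return x * x  # hypothetical unit under test

      TEST_SET = [(0, 0), (3, 9), (-4, 16)]  # (input, expected output)

      def run_tests():
          deviations = []
          for given, expected in TEST_SET:
              actual = square(given)
              if actual != expected:
                  deviations.append((given, expected, actual))
          return deviations

      if __name__ == "__main__":
          failures = run_tests()
          for given, expected, actual in failures:
              print(f"square({given}): expected {expected}, got {actual}")
          print("FAIL" if failures else "PASS")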

  15. REPEATABILITY
  • The same tests are run a number of times and their results are compared.
  • Clearly, we want these results to be comparable.
  • It follows that we want the same test to give the same result every time it succeeds (and preferably the same result every time the same error is present).
  • This places limitations on, for example, randomization, where the same seed should be used every time, and extra care needs to be taken when human interaction is included in testing.
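
  A sketch of the seeding rule in Python: fixing the seed makes the "random" test inputs identical on every run, so results stay comparable (the squaring property being checked is just an illustration):

      import random
      import unittest

      class TestWithFixedSeed(unittest.TestCase):
          def test_random_inputs_are_repeatable(self):
              # The fixed seed means this test generates the same inputs
              # on every run, keeping the runs comparable.
              rng = random.Random(42)
              inputs = [rng.randint(-1000, 1000) for _ in range(100)]
              for x in inputs:
                  self.assertEqual(abs(x) ** 2, x * x)  # property under test

      if __name__ == "__main__":
          unittest.main()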

  16. WRITING TESTABLE SOFTWARE
  • Write your code (classes etc.) with testing facilities: self-tests, output routines, etc.
  • This is not a big effort when you create the software; afterwards, retrofitting them is a huge job.
  • You can use these routines constantly to test your code as you develop it.
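
  A sketch of what such built-in facilities might look like, with a hypothetical Account class carrying a self-test and a debug-output routine:

      # Hypothetical class written with testing facilities built in.
      class Account:
          def __init__(self, balance=0):
              self.balance = balance

          def deposit(self, amount):
              self.balance += amount

          def debug_dump(self):
              # Output routine used while developing and testing.
              return f"Account(balance={self.balance})"

          @staticmethod
          def self_test():
              a = Account()
              a.deposit(10)
              assert a.balance == 10, a.debug_dump()
              return True

      if __name__ == "__main__":
          print("self test passed:", Account.self_test())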

  17. TEST COVERAGE
  • Two ways to think about this:
    - How well do the tests cover the required functionality of the application?
    - How well do the tests cover the implementation?
  • Q: What is the difference between these two?
  • A:
    - The implementation usually includes things that are not required.
    - Sometimes some required functionality is completely missing.

  18. Dead code
  • Example (the last branch can never execute):

      if ((a < 0) && (b < a)) {
          /* ... */
      } else if ((a >= 0) && (b < a)) {
          /* ... */
      } else if (b < a) {
          /* dead code: either a < 0 or a >= 0 always holds, so whenever
             b < a, one of the first two branches has already been taken */
      }

  • Sometimes dead code is intentionally written to manage "impossible situations, just in case somebody changes the code to make them possible", or as "error management for errors that should be impossible".
  • Dead code is not very rare.

  19. Code coverage
  • Some testing tools keep track of the lines of code that are visited and calculate the percentage of visited lines over all lines.
  • Because of dead code, 100% code coverage may not be possible.
  • Note that 100% code coverage does not mean "complete testing": it just means that the code on each line has been executed at least once. In the minimal case, each line has been executed with only one combination of variable and parameter values.
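
  A small illustration of that last point (divide_positive and its bug are hypothetical): a single test achieves 100% line coverage yet misses a bug that appears only for other value combinations:

      def divide_positive(a, b):
          # Hypothetical bug: floor division (//) used where true
          # division (/) was intended.
          return a // b

      # This one test executes every line, i.e. 100% line coverage,
      # and it passes, because 8 // 2 happens to equal 8 / 2.
      assert divide_positive(8, 2) == 4

      # A value combination the coverage figure says nothing about:
      # divide_positive(3, 2) returns 1, although 1.5 was intended.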

  20. TEST DATA DESIGN
  • Test for extreme/boundary values.
  • Test for incorrect input.
  • Test for cases known to be difficult.
  • Test for different combinations of extreme/boundary values of parameters.
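
  A sketch of the first two points in Python's unittest; is_valid_percentage is a hypothetical function whose valid range is 0 to 100, and the test data sits on both sides of each boundary:

      import unittest

      # Hypothetical function under test: valid percentages are 0..100.
      def is_valid_percentage(p):
          return 0 <= p <= 100

      class TestBoundaryValues(unittest.TestCase):
          def test_boundaries_and_incorrect_input(self):
              # Values just outside, exactly on, and just inside each
              # boundary of the valid range.
              cases = [(-1, False), (0, True), (1, True),
                       (99, True), (100, True), (101, False)]
              for value, expected in cases:
                  with self.subTest(value=value):
                      self.assertEqual(is_valid_percentage(value), expected)

      if __name__ == "__main__":
          unittest.main()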

  21. TEST DOCUMENTATION
  • An overall testing plan for the software.
  • Include criteria for accepting and rejecting tests, and for the situations in which further testing is not possible.
  • For each test case: method and input.
  • For each test execution: date, user, outcome.
  • All of this applies, of course, only once the software moves into "official" testing.
  • Resources: http://www.rspa.com -> SE Resources -> Software Engineering Documents -> Software Test Plan – Sample

  22. EXTERNAL TESTING
  • Sometimes it is even required that the software be tested by an external testing authority (e.g. Nokia Forum requires this for programs that are traded through them).
  • There are now companies that concentrate primarily on testing.
  • Testing, of course, requires a lot of programming these days.
  • Sometimes the testing is even outsourced to another country.

  23. DEBUGGING
  • Debugging is finding the cause of an error.
  • Sometimes the symptom appears quite far from the cause of the error.
  • Use a debugger, or print out information, to detect how things are going.
  • If you are at a dead end, get help. (It is often easier for other people to solve the problem, as you may have become blind to the situation and to your own errors.)
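
  Both techniques from the third point, sketched in Python; running_average is a hypothetical function being investigated:

      def running_average(values):
          total, count = 0, 0
          for v in values:
              total += v
              count += 1
              # Print-style debugging: show intermediate state to locate
              # where the computation first goes wrong.
              print(f"v={v} total={total} count={count}")
          return total / count

      if __name__ == "__main__":
          # Alternatively, use the standard debugger instead of prints:
          # import pdb; pdb.set_trace()
          print(running_average([1, 2, 3]))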
