Approaches to Testing Software
• Some of us “hope” that our software works as opposed to “ensuring” that our software works. Why?
  • Just foolish
  • Lazy
  • Believe that it’s too costly (time, resources, effort, etc.)
  • Lack of knowledge
• DO NOT use the “I feel lucky” or “I feel confident” approach to testing, although you may feel that way sometimes.
• Use a methodical approach to testing to back up the “I feel lucky/confident” feeling
  • Methods and metrics utilized must show VALUE
  • Value, unfortunately, is often expressed in negative terms
    • Severe problems that cost loss of lives or business
    • Problems that cost more than testing expenses and effort
Demonstrate Value of Testing
• Catastrophic problems (e.g. life- or business-ending ones) do not need any measurements, but others do:
• Measure the cost of problems found by customers
  • Cost of problem reporting/recording
  • Cost of problem recreation
  • Cost of problem fix and retest
  • Cost of solution packaging and distribution
  • Cost of managing the customer problem-to-resolution steps
• Measure the cost of discovering the problems and fixing them prior to release
  • Cost of planning reviews and testing
  • Cost of executing reviews and testing
  • Cost of fixing the problems found and retest
  • Cost of inserting fixes and updates
  • Cost of managing problem-to-resolution steps
• Compare the two costs and include the loss of customer “good will” (a rough comparison is sketched below)
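To make the cost comparison concrete, here is a minimal sketch that sums the two cost buckets listed above. All dollar figures are purely hypothetical placeholders for illustration, not measured data from the slides or the book.

```python
# Illustrative comparison of the two cost buckets described above.
# Every dollar figure is a made-up placeholder, not real data.

pre_release_costs = {
    "plan reviews and testing": 20_000,
    "execute reviews and testing": 35_000,
    "fix problems found and retest": 15_000,
    "insert fixes and updates": 5_000,
    "manage problem-to-resolution steps": 5_000,
}

post_release_costs = {
    "problem reporting/recording": 10_000,
    "problem recreation": 20_000,
    "problem fix and retest": 30_000,
    "solution packaging and distribution": 25_000,
    "manage customer problem-to-resolution steps": 15_000,
    "estimated loss of customer good will": 50_000,
}

pre_total = sum(pre_release_costs.values())
post_total = sum(post_release_costs.values())

print(f"Cost of finding/fixing before release: ${pre_total:,}")
print(f"Cost of problems found by customers:   ${post_total:,}")
print("Testing shows value if the first number is smaller:", pre_total < post_total)
```

With these placeholder numbers the pre-release bucket is far cheaper, which is the kind of comparison the slide asks you to make with your own measured costs.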
Conflicting Beliefs about Software
• Software development is an art:
  • It takes artistic creativity to develop competitive software; thus rigorous, repeatable techniques and testing are not needed.
  • This is pure garbage: great artists go through and master very rigorous training in techniques and basics before they create their own material
• Software development is science and engineering:
  • It takes systems science and engineering to develop competitive, good-quality software; thus creativity and artistic notions are irrelevant.
  • This is also pure garbage: “great” and “competitive” scientific and engineering products require innovation and artistic creativity
• You NEED both!
  • Need innovation and experimentation
    • There are potentially too many logical combinations to test them all
  • Need repeatability and consistency
    • Good measurements and records must be kept
Need an “Engineering” Approach
• Calling “programmers” and “testers” software engineers does not make them engineers.
• Engineers apply science to their problem solving
  • Apply science to finding and resolving “bugs”
  • Plan for test (what activities, when, and by whom)
  • Devise test scenarios and test cases
  • Apply measurement and keep track of results (see the sketch after this list)
• Fundamental rules for practicing engineering include:
  • Stating the methods followed and why
  • Stating the assumptions
  • Applying adequate factors of safety
  • Getting a second opinion “where reasonable” (not “always,” as the book states)
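One way to "apply measurement and keep track of results" is to record each test case with its scenario, assumptions, and outcome. The record structure and field names below are illustrative assumptions, not anything prescribed by the slides or the book.

```python
# A minimal sketch of a test-case record used to track planned scenarios,
# stated assumptions, and measured results. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class TestCaseRecord:
    case_id: str
    scenario: str                  # what is being exercised and why
    assumptions: List[str]         # stated assumptions (environment, data, etc.)
    planned_for: date              # when the test is planned to run
    executed: bool = False
    passed: Optional[bool] = None  # None until the case has been executed
    defects_found: List[str] = field(default_factory=list)

# Example usage with made-up identifiers
record = TestCaseRecord(
    case_id="TC-001",
    scenario="Login rejects an expired password",
    assumptions=["Test database seeded with an expired account"],
    planned_for=date(2024, 1, 15),
)
record.executed = True
record.passed = False
record.defects_found.append("DEF-042: expired password accepted")

executed = [r for r in [record] if r.executed]
pass_rate = sum(1 for r in executed if r.passed) / len(executed)
print(f"Pass rate so far: {pass_rate:.0%}, defects logged: {len(record.defects_found)}")
```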
Engineering Practices
• State the methods used and why
  • Clearly define the methodology and perform accordingly (SEI & ISO)
  • (Try defining a methodology that describes test planning)
• State all the assumptions
  • List the unknowns and dependencies (there are many in complex systems)
  • Examples for testing are:
    • What types of testing (functional, regression, etc.)
    • Operating environments included (databases, networks, operating systems)
    • Test process and developers’ commitments
    • (any other examples?)
• Apply an “adequate” factor of safety (see the sketch after this list)
  • For a system purported to handle x transactions/second, how much extra “buffer” would/should you be testing for? Exactly x, or 2x?
  • For testing efforts that took y people-time units before, would you use y or 1.2y people-units?
• Get a second opinion
  • Inspections and reviews of test plans, test cases, and test results are very useful
  • Working “with” parties external to testing, such as development, is important, whether you are:
    • Breaking “their” product, or
    • Showing “their” product works to specification
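The factor-of-safety idea can be expressed directly in a load-test check. This is a minimal sketch: the required throughput, the 2x safety factor, and the handle_transaction() entry point are all hypothetical stand-ins for your system's real figures and interface.

```python
# A minimal sketch of applying a factor of safety to a throughput target.
# REQUIRED_TPS, SAFETY_FACTOR, and handle_transaction() are hypothetical.
import time

REQUIRED_TPS = 100          # "system purported to handle x transactions/second"
SAFETY_FACTOR = 2.0         # test at 2x rather than exactly x
TEST_DURATION_S = 5

def handle_transaction() -> None:
    """Placeholder for a call into the system under test."""
    pass

def measured_tps(duration_s: int) -> float:
    """Drive transactions as fast as possible and measure sustained throughput."""
    count = 0
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        handle_transaction()
        count += 1
    return count / duration_s

if __name__ == "__main__":
    tps = measured_tps(TEST_DURATION_S)
    target = REQUIRED_TPS * SAFETY_FACTOR
    print(f"Sustained {tps:.0f} tps; target with safety factor is {target:.0f} tps")
    assert tps >= target, "System does not meet the throughput target with margin"
```

Whether the right margin is 1.2x or 2x is exactly the engineering judgment the slide is asking about; the code only makes the chosen factor explicit and testable.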
Balancing Art and Science in Testing
• When organizing and coordinating testing efforts, utilize a lot of “communications” skills
  • Testing requires a “team” effort
    • Among testers
    • Between testers and other software development members
• When planning, tracking, and reporting testing, utilize a lot of tool and measurement skills from engineering
  • Testing is part of a process (e.g. design inspection, unit testing, functional testing, system testing) that requires
    • Tools (such as?)
    • Measurements (such as? two common ones are sketched below)
• Testers do not always have to be experts in the application (though that is preferred), but:
  • They must use a systematic methodology and sound reasoning (explain?)
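The slide leaves "Measurements (such as?)" as an open question. As one possible answer, here is a hedged sketch of two commonly cited testing measurements; the input numbers are placeholders, and these metrics are examples I am supplying, not ones named in the source.

```python
# Two commonly cited testing measurements, offered as a possible answer to
# the slide's "Measurements (such as?)" prompt. Inputs are placeholders.

def defect_density(defects_found: int, size_kloc: float) -> float:
    """Defects per thousand lines of code (KLOC)."""
    return defects_found / size_kloc

def test_pass_rate(passed: int, executed: int) -> float:
    """Fraction of executed test cases that passed."""
    return passed / executed

if __name__ == "__main__":
    print(f"Defect density: {defect_density(42, 12.5):.1f} defects/KLOC")
    print(f"Pass rate:      {test_pass_rate(180, 200):.0%}")
```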
Bottom-Up & Top-Down Testing Approaches
• Bottom-up testing approach (a small example is sketched below)
  • Test the individual pieces (via unit testing)
  • Combine the tested pieces and test; continue until the whole is tested
  • Advantage: methodical and easy to pinpoint problems
  • Disadvantage: time consuming, and integration problems are detected late
• Top-down testing approach
  • Bring together the completed pieces and start testing the complete system
  • Integrate the new pieces as they are completed and continue the system test
  • Advantage: see the unexpected integration problems early
  • Disadvantage: harder to troubleshoot individual, badly tested pieces
• The author’s assumption of unit-tested code on page 66 is a dangerous one! Why? What’s the definition of unit-tested code? Who defined it?
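A minimal bottom-up sketch: unit-test two small pieces in isolation, then test their combination. The functions and test cases are invented for illustration and are not from the book.

```python
# Bottom-up testing in miniature: test each piece, then the combined path.
import unittest

def parse_amount(text: str) -> float:
    """Piece 1: parse a monetary string like '12.50'."""
    return float(text.strip())

def apply_tax(amount: float, rate: float = 0.07) -> float:
    """Piece 2: add sales tax to an amount."""
    return round(amount * (1 + rate), 2)

class TestPieces(unittest.TestCase):
    # Bottom-up step 1: test each piece in isolation
    def test_parse_amount(self):
        self.assertEqual(parse_amount(" 12.50 "), 12.50)

    def test_apply_tax(self):
        self.assertEqual(apply_tax(100.0), 107.0)

class TestCombined(unittest.TestCase):
    # Bottom-up step 2: combine the tested pieces and test the whole path
    def test_parse_then_tax(self):
        self.assertEqual(apply_tax(parse_amount("100.00")), 107.0)

if __name__ == "__main__":
    unittest.main()
```

A top-down approach would start from the combined test (or the whole system) and integrate new pieces as they are completed, which is why integration surprises surface earlier but individual faults are harder to localize.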
Black Box and White Box Testing Approaches
• Black box testing:
  • Looking at and testing the system from the requirements perspective: outside-in
  • Do not look at the implementation
• White box testing:
  • Looking at and testing the internals of the system and the code structures: inside-out
  • Develop test cases from the code structure
• Black box testing covers the requirements but may miss extra material in the code, while white box testing covers what is in the code but may miss required functions that were never implemented (a small example of each is sketched below)
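Here is a small, hypothetical contrast between a black-box and a white-box test case for the same function. The shipping rule and the test values are invented for illustration only.

```python
# Black-box tests come from the stated requirement; white-box tests come from
# the code structure (here, both sides of the >= 50 branch).
import unittest

def shipping_cost(order_total: float) -> float:
    """Free shipping for orders of $50 or more, otherwise a flat $5.99."""
    if order_total >= 50.0:      # the branch a white-box tester would target
        return 0.0
    return 5.99

class BlackBoxTests(unittest.TestCase):
    """Derived from the requirement only, ignoring the implementation."""
    def test_small_order_pays_flat_rate(self):
        self.assertEqual(shipping_cost(10.0), 5.99)

    def test_large_order_ships_free(self):
        self.assertEqual(shipping_cost(200.0), 0.0)

class WhiteBoxTests(unittest.TestCase):
    """Derived from the code structure: exercise both sides of the boundary."""
    def test_boundary_exactly_50(self):
        self.assertEqual(shipping_cost(50.0), 0.0)

    def test_just_below_boundary(self):
        self.assertEqual(shipping_cost(49.99), 5.99)

if __name__ == "__main__":
    unittest.main()
```

Note that neither suite would catch a required rule that was never implemented at all (say, a promised discount for members): that gap is what requirements-based black-box testing exists to expose, and what white-box testing alone can miss.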
Comment on Author’s View (p. 67)
• “Interested in the whole system, not any particular part”: agree? (Yes, but what about helping in problem diagnosis and team play?)
• “The system is not a finite state machine”: agree? (It may be a very large finite state machine; that’s why we sometimes use a bottom-up, slower process!)
• “Do not know any company who would pay and use different groups of experts to test interface, unit, function, system, etc.”: agree? (Many do, including MAPICS and IBM.)
Test Organization
• Independent test group reporting to the same management as the development and operations (or user) groups
• As part of the development organization
• No formal test group; use a temporarily formed group
• As part of the operations (or user) organization
• What are the advantages and disadvantages of each proposed test organizational approach?
Need Multiple Views & Approaches
• We need to consider all the approaches, depending on the situation
  • Value of testing
  • Scientific/engineering and artistic views
  • Top-down and bottom-up
  • Black box and white box
  • Test organizations
• The key is to determine what the goals of your testing are, and then plan an approach or approaches that will achieve those goals
Goals of Testing?
• Test as much as time allows
  • Execute as many test cases as time allows?
• Validate all the “key” areas
  • Test only the designated “key” requirements?
• Find as many problems as possible
  • Test all the likely error-prone areas and maximize the number of problems found?
• Validate the requirements
  • Test all the requirements?
• State your goal(s) for testing: what would you like people to say about your system?