This resource provides an in-depth overview of software testing principles, including key terminology such as QA vs. QC, the importance of requirements, configuration control, and the development of robust test plans and test cases. It covers testing methodologies such as black-box and white-box testing, and the levels of testing from unit to system testing, along with practices for bug tracking, ship decisions, and ensuring quality through effective documentation and user-focused testing strategies.
Software Testing 101 By Rick Clements cle@cypress.com
Overview • Terminology • Requirements • Configuration control • Test plan • Test cases • Test procedures • Bug tracking & ship decision • After the testing
Terminology QA vs. QC • Quality assurance (QA) - The management of quality: preventing defects by encouraging best practices. • Quality control (QC) - Verification and validation of the product, including reviews, inspections and testing.
Terminology Test Documents • Test plan (or validation plan) - The plan describing the general approach, the people and equipment required, and the schedule for testing and other validation activities. • Test cases - Specific data used to test the product. • Test procedures - The procedures to follow in testing the product. These may be manual or automated.
Terminology Types of Tests • Black box testing - Testing by looking at the requirements to develop test cases. System level testing is often black box testing. • White box testing - Testing by looking at the structure of the program to develop test cases. White box testing often occurs in unit testing.
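The distinction can be sketched in code. The `classify_age` function below is hypothetical: its stated requirement covers only the minor/adult split, while the negative-age check is an internal detail visible only by reading the code. The black-box test comes from the requirement; the white-box test comes from the code's structure.

```python
# Hypothetical function under test. Its requirement: ages under 18 are
# "minor", 18 or over are "adult". The negative-age branch is an
# internal detail the requirement never mentions.
def classify_age(age):
    if age < 0:
        raise ValueError("age cannot be negative")
    return "minor" if age < 18 else "adult"

# Black-box tests: derived from the requirement alone.
assert classify_age(17) == "minor"
assert classify_age(18) == "adult"

# White-box test: derived from reading the code -- it exercises the
# negative-age branch, which a requirements-only tester would miss.
try:
    classify_age(-1)
    raise AssertionError("expected ValueError for negative age")
except ValueError:
    pass
```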
Terminology Levels of Testing • Unit testing - The process of running small parts of the software. The design teams often handle unit testing. • Integration testing or Interface testing - The testing of pre-tested modules and hardware to determine that they work together. • System testing - The testing of the entire product to see that it meets its requirements. This occurs after the product has been integrated.
Software Development Process • Define requirements → Software design → Software implementation & debug → Software test → Software release • In parallel: Test design → Test implementation & debug, feeding into software test
Requirements • Why are they important? • How are they documented? • What’s important? • What if you don’t have any requirements?
Requirements Why Are Requirements Important? • How do the designers know what to build? • How can you test that it was built correctly? • When you disagree, who’s right?
Requirements Documenting Requirements • Data sheet • Requirements specification • Functional specification • Data base • Traceability matrix
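A traceability matrix, the last item above, maps each requirement to the test cases that verify it, making untested requirements easy to spot. This is a minimal sketch with illustrative IDs, not a real project's data:

```python
# Traceability matrix: requirement ID -> test cases covering it.
# An empty list flags a requirement with no test coverage.
traceability = {
    "REQ-001": ["TC-010", "TC-011"],
    "REQ-002": ["TC-020"],
    "REQ-003": [],  # no covering test yet
}

untested = [req for req, tests in traceability.items() if not tests]
print(untested)  # → ['REQ-003']
```

Even this small check, run as part of a build, keeps requirement coverage visible as the test suite grows.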
Requirements What’s Important • They exist • They are unambiguous and testable • They cover all of the customers (not just the final customer) • They are under configuration control
Requirements No Documented Requirements • Ask the project manager • Ask the marketing representative • Has anything been sent to the customer? • Ask a domain expert • What are the designers building? • Write them down
Configuration Control • Why is it a testing issue? • What to track • Build & Version Number
Configuration Control Why Is It A Testing Issue? • Ship the version that was tested • Diagnose a single test system failing • Detect modules accidentally reverting to an older version • Re-create the software and tests later
Configuration Control What To Track • Requirements • Software • Hardware • Tests
Configuration Control Version & Build Number • Simple to generate • Unique for each build or change • Readily visible and easy to validate for correctness
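One way to make a version number "easy to validate" is to check the string the product reports against an expected pattern before testing starts. This sketch assumes a `major.minor.build` format; the pattern and the function name are illustrative, not a standard.

```python
import re

# Assumed format: "major.minor.build", all numeric (e.g. "2.1.347").
VERSION_RE = re.compile(r"^\d+\.\d+\.\d+$")

def is_valid_version(reported: str) -> bool:
    """Return True if the reported version string is well-formed."""
    return bool(VERSION_RE.match(reported))

assert is_valid_version("2.1.347")
assert not is_valid_version("2.1")       # build number missing
assert not is_valid_version("2.1.347b")  # unexpected suffix
```

Recording this validated string in every test log ties each result to the exact build that produced it.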
Test Plan • What will be tested and not tested? • What approach will be taken? • What resources are needed? • Which people are needed? • Schedule
Test Cases • Test boundary conditions • System interfaces • Where have other errors been found? • What will the users do most often? • What is the most serious if it fails? • Usability • What is unique to your product's environment?
Test Cases Boundary Conditions • Values at and away from the boundaries • Valid and invalid values • Both numeric values and strings • Minimum-1, minimum, maximum, maximum+1, a good middle value
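The five values above translate directly into a table-driven test. The `accepts` function here is a stand-in for any input check with a valid range of 1 to 100 (an assumed range, for illustration):

```python
# Stand-in for a validator with an assumed valid range of 1..100.
def accepts(value, minimum=1, maximum=100):
    return minimum <= value <= maximum

# Boundary-value cases: minimum-1, minimum, a good middle value,
# maximum, and maximum+1.
cases = [
    (0,   False),  # minimum - 1: invalid
    (1,   True),   # minimum: valid
    (50,  True),   # middle value: valid
    (100, True),   # maximum: valid
    (101, False),  # maximum + 1: invalid
]
for value, expected in cases:
    assert accepts(value) == expected, f"failed at {value}"
```

The same table shape works for string lengths: test the empty string, a one-character string, the maximum length, and one past it.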
Test Cases Where Have Errors Been Found? • Errors tend to cluster • Can a common error be fixed? • Would a code review be useful?
Test Cases Usability • Often overlooked • Testers are often the first to see the software from outside the design team • An interface that only makes sense if you know the design is a usability problem • You need to know your users
Test Cases Interfaces • User interface • Interfaces between software modules • Interfaces to other programs • Hardware / software interfaces
Test Cases What Will Users Do Most Often? • Errors in frequently used features impact the user more • Test heavily used areas more heavily • Less-used areas can’t be ignored
Test Cases What Failures Are Most Serious? • Areas where data could be lost • Errors with a large financial impact • Errors that could injure someone
Test Cases Unique to Web Applications • Versions of browsers • Different operating systems • Server capacity • Multiple servers - what happens if one fails?
Test Cases Unique to GUI Based Application • System configuration changes • Size of the screen • Number of colors available • Default font type and size
Test Cases Unique to Database Applications • Compatible with existing data • Testing on a copy of real data • Server capacity
Test Cases Unique to Embedded Applications • Can multiple simultaneous stimuli be handled? • Are hardware failures handled correctly? • Running out of supplies • Components breaking or degrading • Communications errors between components • Can temperature changes cause the system to behave differently?
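Hardware-failure cases like those above can often be tested without real hardware by injecting a fault. This is a minimal sketch: the sensor and the `safe_temperature` wrapper are hypothetical, and passing the read function as a parameter is just one way to make the fault injectable.

```python
# Hypothetical embedded-style wrapper: on a sensor failure it returns
# None ("reading unavailable") instead of crashing the controller.
def safe_temperature(read_sensor):
    try:
        return read_sensor()
    except IOError:
        return None

def broken_sensor():
    """Injected fault: simulates a sensor that stops responding."""
    raise IOError("sensor not responding")

# Normal path: the reading comes through.
assert safe_temperature(lambda: 21.5) == 21.5
# Fault path: the failure is handled, not propagated.
assert safe_temperature(broken_sensor) is None
```

The same pattern covers communications errors between components: inject a timeout or a corrupted frame and verify the software degrades gracefully.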
Bug Tracking & Ship Decision • Bug states • Bug Information • Is the software ready?
After the Testing • Known problems and solutions • Defects not fixed • Shorten the customer service learning curve • Test report • Tuned to the audience • Introduction & conclusion • Major defects • What was run on which versions • What tests couldn’t be run