
Testing and Playtesting



  1. Testing and Playtesting CIS 487/587 Bruce R. Maxim UM-Dearborn

  2. Testing Objectives • Testing is the process of executing a program with the intent of finding errors. • A good test case is one with a high probability of finding an as-yet undiscovered error. • A successful test is one that discovers an as-yet-undiscovered error.

  3. Testing Principles • All tests should be traceable to customer requirements. • Tests should be planned before testing begins. • 80% of all errors are in 20% of the code. • Testing should begin in the small and progress to the large. • Exhaustive testing is not possible. • Testing should be conducted by an independent third party if possible.

  4. Software Defect Causes • Specification may be wrong. • Specification may be a physical impossibility. • Faulty program design. • Program may be incorrect.

  5. Good Test Attributes • A good test has a high probability of finding an error. • A good test is not redundant. • A good test should be best of breed. • A good test should not be too simple or too complex.

  6. Test Strategies • Black-box or behavioral testing • knowing the specified function a product is to perform and demonstrating correct operation based solely on its specification without regard for its internal logic • White-box or glass-box testing • knowing the internal workings of a product, tests are performed to check the workings of all possible logic paths
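
The black-box/white-box distinction can be made concrete with a small example. `clamp_health` is a hypothetical function invented for illustration; the first group of tests is derived only from its specification, the second from its branch structure:

```python
# Hypothetical function under test: clamp a health value into [0, max_hp].
def clamp_health(value, max_hp=100):
    if value < 0:
        return 0
    if value > max_hp:
        return max_hp
    return value

# Black-box (behavioral) tests: derived only from the spec
# ("result stays in [0, max_hp]"), ignoring the internal logic.
assert clamp_health(50) == 50
assert clamp_health(-10) == 0
assert clamp_health(250) == 100

# White-box (glass-box) tests: one case per logic path in the code.
assert clamp_health(-1) == 0      # exercises the value < 0 branch
assert clamp_health(101) == 100   # exercises the value > max_hp branch
assert clamp_health(0) == 0       # exercises the fall-through path
```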

  7. Strategic Approach to Testing - 1 • Testing begins at the component level and works outward toward the integration of the entire computer-based system. • Different testing techniques are appropriate at different points in time. • The developer of the software conducts testing and may be assisted by independent test groups for large projects. • The role of the independent tester is to remove the conflict of interest inherent when the builder is testing his or her own product.

  8. Strategic Approach to Testing - 2 • Testing and debugging are different activities. • Debugging must be accommodated in any testing strategy. • Need to consider verification issues • are we building the product right? • Need to Consider validation issues • are we building the right product?

  9. Strategic Testing Issues - 1 • Specify product requirements in a quantifiable manner before testing starts. • Specify testing objectives explicitly. • Identify the user classes of the software and develop a profile for each. • Develop a test plan that emphasizes rapid cycle testing.

  10. Strategic Testing Issues - 2 • Build robust software that is designed to test itself (e.g. use anti-bugging). • Use effective formal reviews as a filter prior to testing. • Conduct formal technical reviews to assess the test strategy and test cases.
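
The "anti-bugging" idea above can be sketched as code that checks its own invariants on every mutation, so a violation is reported the moment it happens rather than corrupting later state. All names here are hypothetical:

```python
# Anti-bugging sketch: a self-checking game inventory.
class Inventory:
    MAX_SLOTS = 8  # named constant instead of a magic number

    def __init__(self):
        self.items = []

    def add(self, item):
        assert item is not None, "anti-bugging: refusing to store None"
        assert len(self.items) < self.MAX_SLOTS, "anti-bugging: overflow"
        self.items.append(item)
        self._check_invariants()

    def remove(self, item):
        assert item in self.items, "anti-bugging: item was never held"
        self.items.remove(item)
        self._check_invariants()

    def _check_invariants(self):
        # The structure verifies itself after every operation.
        assert 0 <= len(self.items) <= self.MAX_SLOTS
```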

  11. Stages of Testing • Module or unit testing. • Integration testing. • Function testing. • Performance testing. • Acceptance testing. • Installation testing.

  12. Regression Testing • Check for defects propagated to other modules by changes made to the existing program. • A representative sample of existing test cases is used to exercise all software functions. • Additional test cases focus on software functions likely to be affected by the change. • Test cases that focus on the changed software components.
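
The three kinds of regression cases above can be sketched as a suite-selection routine. The test names, component tags, and sample size below are invented for illustration, not taken from the slides:

```python
# Sketch: build a regression suite from (1) every case tagged with a changed
# component, plus (2) a random representative sample of the remaining cases.
import random

test_cases = [
    {"name": "save_game_roundtrip", "components": {"savefile"}},
    {"name": "load_corrupt_save",   "components": {"savefile"}},
    {"name": "pathfinding_basic",   "components": {"ai"}},
    {"name": "render_smoke",        "components": {"renderer"}},
    {"name": "audio_smoke",         "components": {"audio"}},
]

def regression_suite(changed, sample_size=2, seed=0):
    # Cases focused on the changed components always run...
    focused = [t for t in test_cases if t["components"] & changed]
    # ...plus a seeded (reproducible) sample of everything else.
    rest = [t for t in test_cases if not (t["components"] & changed)]
    random.seed(seed)
    sample = random.sample(rest, min(sample_size, len(rest)))
    return focused + sample

suite = regression_suite(changed={"savefile"})
names = {t["name"] for t in suite}
# Both savefile-focused cases are included, plus a 2-case sample.
assert {"save_game_roundtrip", "load_corrupt_save"} <= names
assert len(suite) == 4
```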

  13. Testing • Stages of game testing are similar to those of all software testing: • Unit testing • Integration testing • Validation testing • Usability testing • System or Performance testing

  14. Test Plans • Goals and objectives • Test strategy • Components to be tested • Resources • Schedule • Work products • Testing procedures (tactics), including test cases for all phases of testing

  15. The next 8 slides come from the Rabin text

  16. The Five-Step Debugging Process 1. Reproduce the problem consistently 2. Collect clues 3. Pinpoint the error 4. Repair the problem 5. Test the solution

  17. Step 2: Collect Clues • Each clue is a chance to rule out a cause • Each clue is a chance to narrow down the list of suspects • Realize that some clues can be misleading and should be ignored

  18. Step 3: Pinpoint the Error Two main methods: 1. Propose a hypothesis • You have an idea what is causing the bug • Design tests to prove or disprove your hypothesis 2. Divide and conquer • Narrow down what could be causing the bug • Eliminate possibilities from the top down, or • Backtrack from the point of failure upward
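
The divide-and-conquer method can be sketched as a bisection, assuming failures are monotone (every stage at or after the bug produces bad output, as in `git bisect`). The pipeline stages and output checker below are hypothetical:

```python
# Divide-and-conquer bug hunting over an ordered pipeline of stages:
# find the first stage whose output breaks an invariant.
def first_bad_stage(stages, is_output_ok, initial):
    """Return the index of the first stage producing bad output.

    Assumes output is good before the buggy stage and bad from it onward.
    """
    # Precompute the output after each stage.
    outputs = []
    state = initial
    for stage in stages:
        state = stage(state)
        outputs.append(state)

    lo, hi = 0, len(stages) - 1
    while lo < hi:
        mid = (lo + hi) // 2
        if is_output_ok(outputs[mid]):
            lo = mid + 1   # bug must be after mid
        else:
            hi = mid       # bug is at or before mid
    return lo

# Example: the third stage (index 2) flips the sign by mistake,
# so every output from that point on violates the "> 0" invariant.
stages = [lambda x: x + 1, lambda x: x * 2, lambda x: -x, lambda x: x - 5]
assert first_bad_stage(stages, lambda out: out > 0, initial=1) == 2
```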

  19. Step 4: Repair the Problem • Propose a solution • Consider the implications of the fix at this point in the project • The programmer who wrote the code should ideally fix the problem (or at least be consulted) • Explore other ways the bug could occur • Ensure the underlying problem is fixed, not just a symptom of the problem

  20. Expert Debugging Tips • Question assumptions • Minimize interactions and interference • Minimize randomness • Break complex calculations into steps • Check boundary conditions • Disrupt parallel computations • Exploit tools in the debugger • Check code that has recently changed • Explain the bug to someone else • Debug with a partner • Take a break from the problem • Get outside help

  21. Tough Debugging Scenarios • Bug exists in Release but not Debug • Uninitialized data or optimization issue • Bug exists on final hardware, not dev-kit • Find out how they differ – usually memory size or disc emulation • Bug disappears when changing something innocuous • Timing or memory overwrite problem • Intermittent problems • Record as much info when it does happen • Unexplainable behavior • Retry, Rebuild, Reboot, Reinstall • Internal compiler errors • Full rebuild, divide and conquer, try other machines • Suspect it’s not your code • Check for patches, updates, or reported bugs • Contact console maker, library maker, or compiler maker

  22. Adding Infrastructureto Assist in Debugging • Alter game variables during gameplay • Visual AI diagnostics • Logging capability • Recording and playback capability • Track memory allocation • Print as much information as possible on a crash • Educate your entire team • testers, artists, designers, producers
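
The recording-and-playback bullet can be sketched in a few lines, assuming a deterministic update function; the toy game logic below stands in for an engine and is not from the slides:

```python
# Minimal input-recording/playback infrastructure: log every (frame, input)
# pair during play, then replay the log to reproduce the exact final state.
def update(state, command):
    # Deterministic toy game logic: track an x position.
    if command == "left":
        return {"x": state["x"] - 1}
    if command == "right":
        return {"x": state["x"] + 1}
    return state

class Recorder:
    def __init__(self):
        self.log = []

    def record(self, frame, command):
        self.log.append((frame, command))

    def playback(self, initial_state):
        # Re-run the logged inputs through the same update function.
        state = dict(initial_state)
        for _frame, command in self.log:
            state = update(state, command)
        return state

rec = Recorder()
state = {"x": 0}
for frame, cmd in enumerate(["right", "right", "left"]):
    rec.record(frame, cmd)
    state = update(state, cmd)

# The replayed session lands in exactly the same state as the live one.
assert rec.playback({"x": 0}) == state == {"x": 1}
```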

  23. Prevention of Bugs • Set the compiler to its highest warning level • Treat compiler warnings as errors • Compile with multiple compilers • Write your own memory manager • Use asserts to verify assumptions • Initialize variables when they are declared • Bracket loops and if statements • Use cognitively different variable names • Avoid identical code in multiple places • Avoid magic (hardcoded) numbers • Verify code coverage when testing
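
Several of these habits are shown side by side in one illustrative snippet (all names are made up for the example):

```python
# Preventive habits: named constants instead of magic numbers, asserts to
# verify assumptions, variables initialized at declaration, distinct names.
import enum

class Weapon(enum.Enum):        # cognitively distinct names, no magic ints
    SWORD = enum.auto()
    BOW = enum.auto()

BASE_DAMAGE = 12                # named constant instead of a hardcoded 12

def damage(weapon, multiplier=1.0):
    assert multiplier >= 0, "verify assumptions with asserts"
    total = 0.0                 # initialized when declared
    if weapon is Weapon.SWORD:  # every branch explicit
        total = BASE_DAMAGE * multiplier
    elif weapon is Weapon.BOW:
        total = BASE_DAMAGE * 0.75 * multiplier
    return total

assert damage(Weapon.SWORD) == 12.0
assert damage(Weapon.BOW, 2.0) == 18.0
```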

  24. Game Test Cycles - 1 • Alpha test cycles • Can be applied when the game product has at least one playable logic path • Basic user interface should be complete • Game should run on the minimum hardware configuration • Test multiplayer functions if called for • Game installer should work and a draft manual should exist

  25. Game Test Cycles - 1 • Alpha test cycle objectives • Test all modules for current product version • Create a defect database and test plan • Record known defects and performance test results

  26. Game Test Cycles - 2 • Beta test cycles • Applied after all game features and options have been implemented • Game allows testers to navigate all logic paths to allow removal of most game termination defects • Final game interface, artwork, and audio should be present • Final game installer and manual exist

  27. Game Test Cycles - 2 • Beta test cycle objectives • Isolate significant defects and performance problems • Complete testing, defect removal, and performance testing • Complete compatibility testing

  28. Game Test Cycles - 3 • Gold release candidates • Applied after senior management has reviewed product and defect database • All program features will be present • Performance is appropriate for release • Online documentation is complete • All major defects removed • Game runs on all supported platforms

  29. Game Test Cycles - 3 • Gold release candidate objectives • Successful playtesting will occur • Similar to acceptance testing, except that the publisher determines the criteria, not end users • End product will be potentially shippable • Gold release candidate will be classified as a baseline work product (therefore changes are quite risky and must be approved by all stakeholders)

  30. Acceptance Testing • Purpose of acceptance testing is to have customers work with the software in their own environment • Determine that software requirements have been met • Largely a highly scripted affair, since it determines whether or not developers will get paid

  31. Playtesting • Game testers play the game and try to use game features in the same way as end users • Use the game manual and see if the game installs according to the written directions • See if the game plays according to the game manual's instructions • Playtesting needs to take place without coaching

  32. Playtesting Goals • Playtesting follows software testing • Is the game fun? • Are there problems with the game mechanics? • Software testing + more • Tuning game play • Tuning the game flow experience • Does the player get the point of the game?

  33. Pragmatics • Playtesting is not the same as debugging (which is very code oriented) • Better to have a game that plays well and crashes occasionally, than a game that runs great and has a boring ending • Testers must feel that developers are reacting to their feedback and criticism in an egoless manner

  34. Who is the Right Playtester? • Need a person who can explain the reasons behind his or her likes and dislikes • General impressions (e.g., "it was fun") are not very helpful • It is helpful to involve first-time users, since they can often spot usability problems and poor game play quickly

  35. Possible Testers • Development team members and test team members may not be the best playtesters (if they don't match the intended game audience profile) • On the other hand, they are motivated to improve the product and know what needs to be tested

  36. Team Structure • Fundamental issue: builders testing their own work creates willful blindness • Having testing team members who are different from your coders is vital • Separate software testing team • Separate playtesting team • Separate SQA group(?)

  37. Playtesting Team • Internal team • Manages playtesting people • Does its own playtests • Recruits outside testers • External team • Representative sample of target players • Varying skills • Unfamiliar with the product • Not in love with your product! • Not yet bored with the product

  38. External Testers • Do they like the game? • They may lie to you • Politeness effect • If they can't say why they like a game feature, you have a problem • When/where do they get frustrated? • What are the common problem areas? • Often skill-based • The internal team also judges each player's skill

  39. Focus Groups • Representative sample of target audience for game product • Need to determine what feedback is needed back from target audience • Can provide feedback on key screen displays, storyboards, interface prototypes early in the design process • Early feedback on gameplay can shape direction of final product with less rework

  40. Guided or Unguided • Unguided: testers play the game without restrictions • Better used later in the development process • Guided: testers are given specific tasks to accomplish • Used early in development or during regression testing

  41. Outcomes • When playtesters cannot progress in the game, the designer should try to figure out the reason • If the game is too hard, it needs to be simplified • If playtesters are doing unexpected things and the results are entertaining, those behaviors should be added to the game as features

  42. Case Study from C.D. Shaw

  43. External Testers: Half-Life • People near Valve's offices who had sent in registration cards • Designers sit quietly while the player struggles • Designer takes notes • A typical 2-hour session produced about 100 action items • First 20-30 sessions were absolutely vital • Learn what was fun • 200 sessions total

  44. Half-Life: Fine tuning • Add instrumentation • Player position, time, health, weapons • Activities: • Game save, dying, being hurt, solving a puzzle, fighting a monster… • Graph series of sessions together • Spot too long with no encounters • Spot long periods of too much health • Spot long periods of too little health
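
The graphing idea above can be sketched as a scan for long stretches where a condition holds; the health samples and thresholds below are invented for illustration, not Valve's actual tooling:

```python
# Instrumentation sketch: sample player health each interval, then flag
# long stretches of too-little (or too-much) health for designers to review.
def long_stretches(samples, predicate, min_length):
    """Yield (start, end) index ranges where predicate holds for
    at least min_length consecutive samples."""
    start = None
    for i, value in enumerate(samples + [None]):  # sentinel flushes last run
        if value is not None and predicate(value):
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_length:
                yield (start, i - 1)
            start = None

health = [95, 90, 20, 15, 10, 12, 80, 85, 90, 95, 100, 100]
low = list(long_stretches(health, lambda h: h < 25, min_length=3))
high = list(long_stretches(health, lambda h: h > 90, min_length=3))
assert low == [(2, 5)]    # a long dangerous stretch: tune difficulty down?
assert high == [(9, 11)]  # a long safe stretch: too few encounters?
```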

  45. Half-Life • Most important playtesting outcome: • Clearly identified good and bad ideas • Playtesting provides the evidence needed to abandon bad ideas

  46. Advice • Don't get defensive! • The tester's opinion is important • Testers: stick to your findings! • Respectfully point out problems • Mix up the hardware • Be honest about the specs on the box • Don't let bad decisions live forever!
