
Usability and Accessibility Test Methods: Preliminary Findings on Validation



Presentation Transcript


  1. Usability and Accessibility Test Methods: Preliminary Findings on Validation Sharon Laskowski, Ph.D. Manager, NIST Visualization and Usability Group, Information Access Division, ITL http://vote.nist.gov

  2. U&A Test Methods: Face Validity • Does the test method test the requirement, and does the requirement improve usability/accessibility? • U&A requirements and tests are based on 30 years of best-practice human factors research and design applied to user interfaces in similar domains • Comments from the public, test labs, and manufacturers on the proposed test methods

  3. Goal: Procedural Validity of the Tests • Reliability: correct and reproducible • Do the tests produce true pass/fail determinations, for any given system, independent of testers and test labs? • Primary concerns • Are test procedures clear, complete, and easy to execute? • Are special tester qualifications required?

  4. Validation Team • Design and Usability Center, Bentley University • Testers: students enrolled in the MS in Human Factors (HF) Program • Range of experience in usability testing, user interface and web design, media • Advisors: • HF Ph.D. researcher, usability metrics • Senior usability consultant

  5. Validation Process: Scope • VVSG test methods for design requirements only • Did not have manufacturer summative usability test reports (no Technical Data Package) • Did not test with alternative languages • Next phases • Accessibility throughout the voting session usability test • Documentation usability/poll worker usability test • Usability performance benchmark test • Any additional VVSG 1.1 design requirements • Validate test for evaluating manufacturer reports

  6. Validation Process: Protocol • 2 voting systems: accessible DRE, optical scanner • Round 1 • 4 individual testers, then team pairs executed the tests • Recorded pass/fail decisions, their confidence in the decisions, and any problems that arose • Round 2 - ongoing • 1 team, detailed recording of test execution, measurements, observations, and pass/fail decisions • Analysis and feedback on test methods • Meta-analysis of process and comparison of pass/fail decisions against expected outcomes

  7. Preliminary Findings: Validation Process • In general, testers understood the intention of the requirements and the HF focus • HF testers need a clear understanding of the validation and certification process • Test execution combined pass/fail decisions with evaluation of the test method itself • Partial interpretation of some tests is useful, e.g., no logging of the alternative language used • Materials to support recording of all data are important • We enhanced the test method documentation and added detailed data collection forms for each system

  8. Qualifications • Testers had detailed knowledge and experience in usability and accessibility, evaluation of user interfaces, and best practices • Few questions concerning the requirements and how to evaluate • More training and knowledge of voting systems and certification would have improved Round 1 • Validation and execution of tests depend heavily on contextual knowledge • E.g., there are many system acronyms and multiple documents (VVSG, test methods, glossaries, etc.)

  9. Tester Skill Sets • Some requirements have test methods that require skills/knowledge atypical for HF professionals, e.g., • Use of photometers or oscilloscopes • Measuring contrast ratios, saturation, screen flicker, and sound levels • Might some of these tests be assigned to in-house VSTL testers? • But not obvious, e.g., measurements for wheelchair reachability may require some HF expertise
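To illustrate the kind of measurement skill slide 9 refers to, here is a minimal sketch of a contrast-ratio computation using the relative-luminance formula from WCAG 2.x. This metric is an illustrative assumption on my part; it is not the VVSG test procedure, and the function names are hypothetical:

```python
# Illustrative sketch: contrast ratio between two sRGB colors, using
# the WCAG 2.x relative-luminance formula. Not the VVSG procedure;
# just the kind of computation slide 9's measurements involve.

def _linearize(channel):
    """Convert an 8-bit sRGB channel to linear light."""
    c = channel / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """Relative luminance of an (R, G, B) tuple of 0-255 values."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (lighter + 0.05) / (darker + 0.05), range 1-21."""
    l1, l2 = relative_luminance(fg), relative_luminance(bg)
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # black on white -> 21.0
```

A photometer measures luminance directly from the screen; this sketch shows only the arithmetic an HF tester would apply to those readings.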

  10. Specific Test Issues • Based on Round 1 and information from VSTLs • What is the ideal bystander distance for privacy testing? • What is the ideal distance from the screen for measuring font size with a magnifier? • What is the best way to simulate 20/70 farsighted vision? • What constitutes “adequate notification” of failure? • Opportunistic test if a particular failure state arises • Round 2 will verify the lack of problems in some areas, e.g., plain language requirements
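The 20/70 and viewing-distance questions above can be framed as a visual-angle calculation. A minimal sketch, assuming the standard Snellen convention that a 20/20 letter subtends 5 arcminutes; the 500 mm viewing distance is a hypothetical example, not a VVSG-specified value:

```python
import math

# Sketch: minimum letter height (mm) legible at a given viewing
# distance for a stated Snellen acuity. Assumes the standard
# convention that a 20/20 letter subtends 5 arcminutes; the
# distance used below is illustrative, not VVSG-specified.

def min_letter_height_mm(distance_mm, snellen_denominator):
    arcmin = 5.0 * (snellen_denominator / 20.0)   # e.g. 20/70 -> 17.5 arcmin
    theta = math.radians(arcmin / 60.0)           # visual angle in radians
    return 2.0 * distance_mm * math.tan(theta / 2.0)

# At an assumed 500 mm viewing distance, a 20/70 viewer needs
# letters roughly 2.5 mm tall:
print(round(min_letter_height_mm(500, 70), 2))  # prints 2.55
```

This also shows why the magnifier question matters: the pass/fail threshold scales linearly with the assumed distance, so the test method must pin that distance down.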

  11. Workflow • Workflow of usability/accessibility test methods was designed to optimize the process • Designed to test multiple requirements • Conduct functional and design tests before usability tests with test participants • VSTLs pointed out that some tests require a new election definition be installed • These should be minimized and incorporated into other types of tests where possible • E.g., overvote/undervote warning tests

  12. Next Steps • Collect and analyze results from Round 2 validation • Determine correctness of pass/fail decisions • Revise test methods based on findings • Clarity • Specificity • Identification of tests that do not require HF expertise • Placement in workflow • Revise qualifications draft document • Continue validation of remaining test methods and additional tests for changes to VVSG 1.1 and 2.0

  13. Discussion/Questions
