
Designing Interfaces for Voting Machines



  1. Designing Interfaces for Voting Machines – Benjamin B. Bederson, Computer Science Department, Human-Computer Interaction Lab, University of Maryland – www.cs.umd.edu/~bederson – bederson@cs.umd.edu – February 4, 2005

  2. Frustrated voters • Voting technology and ballot design can influence election outcomes • Minorities and the poor are more likely to cast their ballots on outdated systems • Technology is in need of updating

  3. When Interfaces Get in the Way • Ballot design • Butterfly ballot • Interaction • Hanging chad • Changing a vote (i.e., how to unselect a candidate) • Write-in problems • 2004 - a NY Times editorial reported on the San Diego mayoral election, where voters for write-in candidate Frye didn't darken the bubble • 2002 - the Mt. Airy, MD mayoral result went from Holt to Johnson and back to Holt, depending on which spellings were accepted
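The Mt. Airy example shows why write-in tallies are fragile: the result depends on which spellings are accepted as the same name. Below is a minimal sketch of grouping write-in variants with a Levenshtein edit-distance threshold; the names, votes, and threshold are illustrative only, not the method any jurisdiction actually used.

```python
# Group write-in spellings under a canonical candidate name using
# Levenshtein edit distance. Names and threshold are illustrative only.

def edit_distance(a: str, b: str) -> int:
    """Classic dynamic-programming Levenshtein distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def tally_write_ins(votes, candidates, max_distance=2):
    """Count each write-in toward the closest candidate within the
    threshold; everything else is set aside for human review."""
    counts = {name: 0 for name in candidates}
    review = []
    for vote in votes:
        best = min(candidates, key=lambda c: edit_distance(vote.lower(), c.lower()))
        if edit_distance(vote.lower(), best.lower()) <= max_distance:
            counts[best] += 1
        else:
            review.append(vote)
    return counts, review

votes = ["Holt", "Hollt", "Johnson", "Jonson", "Hlot", "Smith"]
counts, review = tally_write_ins(votes, ["Holt", "Johnson"])
print(counts)   # {'Holt': 3, 'Johnson': 2}
print(review)   # ['Smith']
```

Votes beyond the threshold are flagged for human review rather than silently dropped, which is where the "acceptable spellings" dispute actually gets decided.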

  4. Usability Part of Larger Issues • Florida 2000 – Traditional technologies flawed • Mechanical levers – break down, difficult to maintain, difficult to store and transport • Paper ballots – errors, difficult to process and interpret • Punch cards – hanging chad, etc. • Economics de-emphasizes usability • Focus on security de-emphasizes usability • Lack of research, because of proprietary systems and the sheer number of designs

  5. Our Study • Funded by • NSF (National Science Foundation), Grant #0306698, "Project to Assess Voting Technology and Ballot Design" • Carnegie Corporation, Grant #D05008 • Consists of: • Expert review <= Focus today • Lab study <= Focus today • New technology <= Focus today • Field test • Natural experiments • Co-Researchers • Paul Herrnson – Univ. of Maryland (project leader) • Michael Traugott & Fred Conrad – Univ. of Michigan • Richard Niemi – Univ. of Rochester Notes: Small-scale studies to demonstrate potential challenges and inform future research. Does not address accuracy, affordability, accessibility, durability, or ballot design. This represents partial results midway through a 3-year study; future work will address accuracy, ballot design, and more.

  6. Partners • Federal Election Commission (FEC) • Maryland State Board of Elections • National Institute of Standards and Technology (NIST) • Vendors • Diebold • Hart InterCivic • ES&S • NEDAP • Avante • Advisory Board

  7. Machines Looked At • Avante Vote Trakker • Diebold AccuVote TS • ES&S Optical Scan • Hart eSlate • NEDAP LibertyVote • UMD Zoomable system Notes: As available for testing. Some machines have been deployed with different options. Some machines have since been updated. Vendors (except NEDAP) implemented ballots for best presentation. Machines were selected to represent specific features.

  8. Avante Vote Trakker (All photos taken by our research group – not provided by vendors.)

  9. Diebold AccuVote TS

  10. ES&S Optical Scan

  11. Hart eSlate

  12. NEDAP LibertyVote

  13. UMD Zoomable System – www.cs.umd.edu/~bederson/voting (Demo)

  14. Expert Review • 12 HCI experts, one evening • 1 voting interaction specialist • 1 government usability practitioner • 5 academic HCI researchers • 6 private usability practitioners • Each used • 6 machines • 2 ballot types where available (office block, party column) • ~15 minutes each • Asked to list concerns • Followed worst-case perspectives of • novice voters • voters with poor language skills • older voters • stressed voters • system errors Notes: Most experts did not have a background in voting systems. Subjective responses require interpretation.

  15. Expert Review Rating System • Each issue given a severity rating (1 = low, 5 = high) • Concerns listed with average severity and number of instances
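To make the rating system concrete, here is a minimal sketch of how the "average severity, number of instances" pairs on the following slides could be derived from individual expert reports. The raw ratings below are hypothetical; only the aggregation step is being illustrated.

```python
from collections import defaultdict

# Hypothetical raw ratings: (machine, concern, severity 1-5), one row per
# expert who raised the issue. The study's actual data format is not given
# in this talk; this only shows how the per-concern summary is derived.
ratings = [
    ("Avante", "No previous button", 4),
    ("Avante", "No previous button", 4),
    ("Avante", "Write-in requires last name", 5),
    ("Avante", "Auto-forward confusing", 3),
    ("Avante", "Auto-forward confusing", 3),
]

by_concern = defaultdict(list)
for machine, concern, severity in ratings:
    by_concern[(machine, concern)].append(severity)

# Report in the slides' format: average severity, number of instances.
for (machine, concern), sevs in sorted(
        by_concern.items(), key=lambda kv: -sum(kv[1]) / len(kv[1])):
    print(f"{sum(sevs) / len(sevs):.1f}  {len(sevs)}  {concern}")
# 5.0  1  Write-in requires last name
# 4.0  2  No previous button
# 3.0  2  Auto-forward confusing
```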

  16. Avante VoteTrakker Concerns (average severity, number of instances)
  5.0 | 1 | Write-in requires last name
  4.0 | 2 | Record shown too fast and without instructions
  4.0 | 2 | No previous button [1]
  3.0 | 2 | Auto-forward confusing [1]
  3.0 | 1 | Smiley face inappropriate
  3.0 | 1 | Title too small
  3.0 | 1 | Instruction text poorly written
  3.0 | 1 | Didn't like this one at all
  3.0 | 1 | "Cast ballot to continue" not clear - it actually finishes
  [1] Navigation focuses on progress with later review, by design

  17. Avante VoteTrakker (more I) Concerns (average severity, number of instances)
  3.0 | 1 | Timed out, but didn't see warning
  3.0 | 1 | Angle of machine is awkward
  3.0 | 1 | Lot of reflection on screen
  3.0 | 1 | Flashing instruction is distracting
  3.0 | 1 | Colors of text poor (green/white, black/blue)
  3.0 | 1 | No progress feedback
  3.0 | 1 | No way to cancel and leave [2]
  3.0 | 1 | No way to start over
  3.0 | 1 | "Please make selection" message is distracting
  3.0 | 1 | No error-checking on write-in
  [2] Can time-out to cancel

  18. Avante VoteTrakker (more II) Concerns (average severity, number of instances)
  3.0 | 1 | Write-in association very small
  3.0 | 1 | No way to go to end and cast ballot [3]
  3.0 | 1 | Lack of color on amendment screen may appear to be an error
  3.0 | 1 | Disabled button is "white" which is very difficult to understand
  3.0 | 1 | Cast ballot button requires 2 presses
  3.0 | 1 | Can't say "no" to paper record - so why bother?
  3.0 | 1 | Have to pick contrast/text size before starting
  3.0 | 1 | No instructions after starting
  2.0 | 1 | Not clear what to do at beginning
  [3] By design, to minimize under-votes

  19. Diebold AccuVote TS Concerns (average severity, number of instances)
  5.0 | 1 | Ballot review confusing. Review colors don't match voting colors
  5.0 | 1 | No help on some screens
  5.0 | 1 | Write-in has no instructions
  4.0 | 1 | Contrast and text size controls not clear
  4.0 | 1 | Some font colors unclear (black on blue, red/blue)
  4.0 | 1 | Party not clearly indicated
  4.0 | 1 | Difficult to use while seated
  4.0 | 1 | Large font is good, but "issues" text runs over screen display area, requiring arrow navigation
  3.0 | 2 | Wait icon is too computerish and not clear
  3.0 | 1 | Card hard to enter

  20. Diebold AccuVote TS (more) Concerns (average severity, number of instances)
  3.0 | 1 | Poor depiction of voting vs. reviewing state
  3.0 | 1 | "Card not inserted" error needs a diagram
  3.0 | 1 | Buttons have poor visual affordance
  3.0 | 1 | Instructions refer to "backspace" key, but it is actually labeled "back"
  3.0 | 1 | Instructions unclear (e.g., "Vote for one")
  3.0 | 1 | Some text unclear (e.g., "2 of 4")
  3.0 | 1 | Multiple write-in unclear
  3.0 | 1 | Write-in not well associated with race being voted
  1.0 | 1 | Extra dots on help/instruction screens

  21. ES&S Optical Scan Concerns (average severity, number of instances)
  5.0 | 1 | Instructions not mandatory, errors likely
  5.0 | 1 | Write-in has high error mode (enter name, but not fill in circle)
  4.0 | 2 | Changing vote process is punitive - must start over, which could cause some to give up
  4.0 | 1 | Poor visual grouping (title could be associated with items below)
  4.0 | 1 | Could fold, bend or tear ballot
  4.0 | 1 | No instructions to review ballot before submitting
  4.0 | 1 | Instructions to turn over page not conspicuous enough
  3.5 | 2 | Font size is fixed, and will be too small for some older and other voters
  3.5 | 2 | No error checking on under-vote
  3.0 | 2 | No error checking on over-vote
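The last two concerns above point at a structural limit of optical scan: the paper form itself cannot warn the voter about under- or over-votes, whereas an electronic interface can check before the ballot is cast. A minimal sketch of such a check, with hypothetical race names and "vote for N" limits (not from the study or any vendor):

```python
# Minimal under-/over-vote check a DRE could run before casting.
# The Race structure and limits are illustrative, not from any vendor.
from dataclasses import dataclass, field

@dataclass
class Race:
    title: str
    vote_for: int                       # "Vote for N" limit
    selections: list = field(default_factory=list)

def check_ballot(races):
    """Return warnings for under-votes and hard errors for over-votes."""
    warnings, errors = [], []
    for race in races:
        n = len(race.selections)
        if n > race.vote_for:
            errors.append(f"Over-vote in {race.title}: {n} of {race.vote_for}")
        elif n < race.vote_for:
            warnings.append(f"Under-vote in {race.title}: {n} of {race.vote_for}")
    return warnings, errors

ballot = [Race("Governor", 1, ["Smith"]),
          Race("School Board", 3, ["Lee"]),          # under-vote: warn
          Race("Mayor", 1, ["Jones", "Garcia"])]     # over-vote: block
warnings, errors = check_ballot(ballot)
print(warnings)  # ['Under-vote in School Board: 1 of 3']
print(errors)    # ['Over-vote in Mayor: 2 of 1']
```

The design question the concerns raise is what to do with the result: under-votes are legal and should only warn, while over-votes invalidate the race and must block casting.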

  22. ES&S Optical Scan (more) Concerns (average severity, number of instances)
  3.0 | 1 | Should use different highlight/feedback that vote was correct
  3.0 | 1 | Why two sets of matching instructions?
  3.0 | 1 | Instructions somewhat difficult for voters with limited English proficiency
  3.0 | 1 | Instructions should say something about no extra marks on ballot
  2.7 | 3 | Needs a better table - low and shaky [1]
  2.0 | 1 | Seated operation awkward
  1.0 | 1 | "Vote in next column" unclear
  1.0 | 1 | Appears to be an entry field at top of column
  [1] Different cost/quality tables available

  23. Hart eSlate Concerns (average severity, number of instances)
  5.0 | 1 | Combining summary and cast ballot confuses actual casting
  4.0 | 1 | No way to jump to end
  4.0 | 4 | Dial slow to learn, hard to use [1]
  4.0 | 1 | Red on blue text, and light fonts hard to read
  3.5 | 2 | After reviewing, it's hard to get back to a choice to change it
  3.5 | 2 | Blue movement on screen is disconcerting
  3.0 | 1 | Cast ballot button didn't accept push - required 3 presses
  [1] Compare to subjective/objective data later

  24. Hart eSlate (more) Concerns (average severity, number of instances)
  3.0 | 1 | Poor progress indicator
  3.0 | 1 | May confuse with a touch screen
  3.0 | 1 | Can't clear entire vote and start over in one step
  3.0 | 1 | Write-in screen does not indicate office being voted for
  3.0 | 1 | Next/Prev and Dial ambiguous
  3.0 | 1 | Auto-forward on select, but not unselect (inconsistent interface)

  25. NEDAP LibertyVote Concerns (average severity, number of instances)
  5.0 | 2 | Write-in message after OK is confusing
  5.0 | 2 | No way to confirm/review write-in name
  5.0 | 1 | "No vote" light should be different color (difficult to see what wasn't finished)
  5.0 | 1 | No clear way to handle multiple write-ins
  5.0 | 1 | Poor feeling of privacy due to size
  4.5 | 2 | "Enter write-in" button doesn't seem to work
  4.3 | 3 | Under-vote message easy to miss
  4.0 | 3 | OK button for write-in too far away
  4.0 | 2 | Too much reflection
  4.0 | 1 | OK button with 4 arrows is weird
  4.0 | 1 | Propositions too far away
  4.0 | 1 | Hard to read/access from seated position

  26. NEDAP LibertyVote (more) Concerns (average severity, number of instances)
  4.0 | 1 | Number pad unclear - what is it for?
  4.0 | 1 | Blue light coding (voted/unvoted) unclear
  4.0 | 1 | "enlarge" scrollbar un-obvious (to left of little message screen)
  4.0 | 1 | Buttons hard to press with poor tactile feedback
  4.0 | 1 | Scroll bar thing to right of message box unclear
  3.7 | 3 | Difficult to correct a vote
  3.5 | 2 | Write-in area too far away
  3.0 | 1 | "Partisan offices" unclear terminology
  3.0 | 1 | Can change language accidentally
  3.0 | 1 | Same color for race and candidate is unclear
  3.0 | 1 | Prefer sequence to "jump around" model of full face ballot
  2.0 | 1 | No second chance to cast vote - review is implicit

  27. NEDAP Actual Ballot

  28. UMD Zoomable Concerns (average severity, number of instances)
  4.0 | 3 | Color of review & cast ballot buttons should be different from progress indicator and selected items
  3.0 | 1 | Not clear how to get started
  3.0 | 1 | Feels like a game - possibly inappropriate
  3.0 | 1 | "Not voted" confusing when multiple choices available
  3.0 | 1 | Peripheral races too visually confusing
  2.5 | 2 | Progress/navigation buttons are partly a progress indicator, but not clear enough
  2.0 | 1 | Overview buttons shouldn't split 4 sub-types

  29. Lab Study • 42 members from Ann Arbor, MI voted on 6 machines • Paid $50 for 1-2 hours • Different random orders for different people • Latin Square design (sketched below) • Over-selected for potential difficulty • Most (69%) >= 50 years old • Most (62%) use computers once every 2 weeks or less • Most (30) voted on office-block ballot • Indicated intention of (fictional) candidates by circling names on paper form • Study not controlled for prior experience, but Ann Arbor uses optical scan • Data: • Satisfaction ratings reported after voting on each machine • Time measurement • Videotaped interactions
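On the Latin Square design noted above: rotating machine order so that each machine appears in each position equally often keeps learning and fatigue effects from favoring any one system. A sketch assuming the common Williams construction for an even number of conditions; the talk does not specify which Latin square variant the study used.

```python
def balanced_latin_square(items):
    """Williams-style balanced Latin square for an even number of items:
    every item appears once in every position, and immediate-order effects
    are counterbalanced across rows."""
    n = len(items)
    first = [0]                     # first row pattern: 0, 1, n-1, 2, n-2, ...
    lo, hi = 1, n - 1
    while len(first) < n:
        first.append(lo)
        lo += 1
        if len(first) < n:
            first.append(hi)
            hi -= 1
    # each subsequent row shifts every entry by one (mod n)
    return [[items[(x + r) % n] for x in first] for r in range(n)]

machines = ["Avante", "Diebold", "ES&S", "Hart", "NEDAP", "UMD Zoomable"]
for i, order in enumerate(balanced_latin_square(machines), 1):
    print(f"Order {i}: {' -> '.join(order)}")
```

With 6 machines this produces 6 distinct orders, and 42 participants divide evenly into 7 per order.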

  30. Lab Study (more) • Looked at: • Time voters spend reading instructions • Response to paper or on-screen ballot • Response to the reporting of under- or over-voting • Ability to change a vote • Complications and malfunctions of DRE or Optical Scan Readers

  31. Lab Study – Satisfaction Data • Usability studies typically measure: Speed, Accuracy, Satisfaction • We are currently reporting on two (Speed, Satisfaction)

  32. “The voting system was easy to use”

  33. “I felt comfortable using the system”

  34. “Correcting my mistakes was easy”

  35. “Casting a write-in vote was easy to do”

  36. “Changing a vote was easy to do”

  37. Lab Study - Time to Cast Ballot

  38. Lab Study – Analysis Remains • Why are some machines consistently most preferred and others least preferred? • Detailed coding of video interactions underway • Planned analyses of video interactions: • Tally of problems by machine that do and do not lead to unintended votes cast • Explanation of satisfaction data in terms of voters’ actions • Remember that usability is only one characteristic of overall performance • Accuracy, Accessibility, Affordability, Durability, Security, Transportability, etc.
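As a sketch of the first planned analysis above, the tally below splits coded problems by machine and by whether they led to an unintended vote. The event records and coding labels are invented for illustration; the study's actual coding scheme is not described in this talk.

```python
from collections import Counter

# Hypothetical coded video events: (machine, problem, led_to_unintended_vote).
events = [
    ("Hart eSlate", "dial overshoot", False),
    ("Hart eSlate", "cast instead of review", True),
    ("NEDAP LibertyVote", "missed under-vote light", True),
    ("Diebold AccuVote TS", "write-in confusion", False),
]

# Count problems per (machine, outcome) pair.
tally = Counter((machine, unintended) for machine, _, unintended in events)
for (machine, unintended), n in sorted(tally.items()):
    kind = "led to unintended vote" if unintended else "recovered by voter"
    print(f"{machine}: {n} problem(s) {kind}")
```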

  39. Future Parts of the Project • Field Test • Assess usability among large, more representative sample • Assess impact of ballot designs on usability issues • Assess accuracy on different systems • Natural Experiments • Assess impact of voting systems and ballot designs on over-voting, under-voting, straight-party voting, and other measures across jurisdictions and over time • Assess impact of changing from specific types of voting systems (or ballots) to another system (or ballot)

  40. Implications and Reflections • Voter intention is the key goal • Usability is as important as security (and so are accuracy, accessibility, affordability, and durability) • Being able to update the interface is important (i.e., certification may be interfering with usability) • Ballot/machine combination is important (i.e., one size doesn't fit all) This talk is available, with vendors' responses, at www.capc.umd.edu
