Presentation Transcript


  1. Coming up: Vote verification talk by Alan Sherman (UMBC)

  2. A Study of Vote Verification Technologies Alan T. Sherman Dept. of CSEE University of Maryland, Baltimore County (UMBC) May 3, 2006

  3. Joint work with Don Norris, Dept. of Public Policy, MIPAR John Pinkston, Dept. of CSEE A. Gangopadhyay, S. Holden, G. Karabatis, A.G. Koru, C. Law, A. Sears, D. Zhang Dept. of Information Systems National Center for the Study of Elections of the Maryland Institute for Policy Analysis and Research (MIPAR)

  4. Diebold AccuVote-TS Touch Screen Direct Recording Equipment (DRE)

  5. How well do verifiers enable voters to check their votes are • cast as intended • recorded as cast • tallied as recorded?

  6. Overview • Evaluated 4 vote verification products • Diebold paper trail (VVPAT) • MIT-Selker audio system • Scytl Pnyx.DRE software system • VoteHere Sentinel (cryptographic receipts) • For Maryland State Board of Elections • Analysis in context of real elections • Interdisciplinary study—first of its kind

  7. Outline • Background and motivation • Voting in Maryland • Related work • Genesis of UMBC study • Verification Systems • Study systems, evaluation criteria • Analysis • Maryland Procedures • Discussion, conclusions, open problems

  8. Background and Motivation

  9. Background • Following 2000 fiasco in FL, MD moved to DREs and centralized management • Began purchasing Diebold DREs in 2001 • DREs improved accuracy and efficiency • No irregularities have been detected, but...

  10. DREs Improve Accessibility • Visually-impaired voters can use headsets, large fonts, or both • So can any other voter

  11. Can DREs Be Trusted? • Malicious code • Subversion of system (hardware, software, OS) • Faulty design, implementation • Key management • Configuration • Data handling • Physical storage and security [Play Baxter Movie]

  12. Voting in Maryland • ~20,000 DREs (100% by fall 2006) • 23 counties + Baltimore City • Dual system of state and local control • 3.1 million registered voters (5.6 million residents) • $96 million on Diebold system by FY 2007 (~$2.82 / resident / year over 6 years) • Financially committed to Diebold through 2012

  13. What Is Special About Voting? • Critical national infrastructure • Everyone must be able to vote • Elderly, infirm, disabled (blind, deaf) • Below average IQ • Happens infrequently • Voters must have confidence in outcome • Conform to state and federal law

  14. Genesis of Study • MD General Assembly (GA) considered move toward paper trail (2005) • GA mandated study (2005) • Governor Ehrlich vetoed study • State Board of Elections commissioned study (August 2005)

  15. Study Question • How well do various vote verification products work? NOT: • What voting system should MD use? • Is the Diebold System secure?

  16. Options for Maryland • Keep Diebold, with parallel testing; continue monitoring technology • Add verification system to Diebold • Change to different system • Precinct-count optical scan (e.g., Automark, Populex) • Receipt-based system (e.g., VoteHere, Punchscan) [Discussing third option is outside study scope]

  17. Related Work • Usability study (Herrnson, et al., 2006) www.capc.umd.edu • Survey of MD voters (Norris, 2006) www.umbc.edu/mipar

  18. Diebold GEMS Server • Dedicated workstation at each LBE; Accumulates DRE votes; Generates reports

  19. Diebold GEMS Server • Dedicated workstation at each LBE; Accumulates DRE votes; Generates reports • All tallies checked by hand from printouts from each DRE of DRE totals

  20. Verification Systems

  21. Benefits of Verification • Increased assurance via independent system • Adversary must corrupt two systems • Separate tally and audit log

  22. Challenges to Verification • Adds complexity (increases cost, chance of disruption, opportunity for privacy loss) • Lack of standard interfaces • Requires modification of Diebold software • Is true system independence possible?

  23. Study Systems • Diebold VVPAT • MIT-Selker audio system • Scytl Pnyx.DRE • VoteHere Sentinel • Democracy Systems VoteGuard • Avante • IP.Com • “Parallel testing” of DREs

  24. Study Systems • Diebold VVPAT • MIT-Selker audio system • Scytl Pnyx.DRE • VoteHere Sentinel • Democracy Systems VoteGuard • Avante • IP.Com • “Parallel testing” of DREs

  25. Math Challenge on Parallel Testing: Given that B of the N DREs are bad, what is the chance of selecting at least one bad DRE in a random sample of k DREs? Solution later …

  26. Evaluation Criteria • Reliability • Functional completeness • Accessibility • Data management • Election integrity, voter privacy • Implementation / integration with DRE • Impact on voters and procedures

  27. Security Criteria • Election integrity • Ballots cast as intended • Ballots recorded as cast • Ballots tallied as recorded • Voter privacy • Resistance to disruption

  28. Study Methods • Met with vendor • Examined product in UMBC lab • Assigned numerical score for each criterion (1-low, 5-high) • Wrote narrative • We did not weight the scores to yield an overall score or product recommendation

  29. Diebold VVPAT: pros • Prints votes on paper roll • Relatively simple and intuitive • Produces physical record

  30. Diebold VVPAT: cons • Can LBEs store paper rolls securely? • Voters cannot verify which rolls are used in a recount • Paper roll records order of votes cast • Barcodes cannot be trusted • Lacks vendor independence • Printer jams easily • Blind voters cannot verify the paper record, only the audio output • Costly ($1,500 / add-on unit)

  31. MIT-Selker Audio System: pros • Records votes on audio tape • Easier to catch mistakes • Relatively simple • Produces physical record • Relatively simple integration • No software required • Inexpensive ($100 / unit)

  32. MIT-Selker Audio System: cons • Can LBEs store tapes securely? • Voters cannot verify which tapes are used in a recount • Tape records order of votes cast • Deaf voters cannot use it • Recount is labor intensive • Vendor lacks business plan • Needs reliable storage of magnetic media

  33. Scytl Pnyx.DRE: pros • Echoes ballot choices on confirmation screen • Stores electronic copy of vote • Well engineered • Has been used outside USA • Two-way handshake with DRE

  34. Scytl Pnyx.DRE: cons • Must trust software to store displayed vote • Can cause DRE to fail and vice-versa (via two-way handshake) • More complicated integration with DRE • Not all functionality implemented • $500 / unit

  35. VoteHere Sentinel: pros • Outstanding election integrity: voter can verify vote is recorded in official data as cast, and that tally is computed correctly from official data • Integrity based on cryptography, not computer security • Open source, high quality software • Disabled voters can enjoy same level of integrity
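
  A toy sketch of the receipt idea behind slide 35, under a simplified hash-commitment model: the official record publishes a commitment for each ballot, and the voter later checks that the commitment printed on their receipt appears in the published data. This is an illustration only, not VoteHere's actual cryptographic protocol; the function names and data layout are assumptions made for this example.

```python
# Toy hash-commitment sketch of the receipt idea in slide 35.
# Illustrative only: NOT VoteHere's actual protocol. Names, data layout,
# and the use of SHA-256 are assumptions made for this example.
import hashlib, os

def commit_to_vote(vote: str) -> tuple[str, bytes]:
    """Return (commitment, nonce); the commitment hides the vote but binds it."""
    nonce = os.urandom(16)
    commitment = hashlib.sha256(nonce + vote.encode()).hexdigest()
    return commitment, nonce

def voter_checks_receipt(receipt_commitment: str, published: set[str]) -> bool:
    """After the election, the voter looks up the receipt in the official data."""
    return receipt_commitment in published

# Election side: the official record publishes one commitment per ballot.
c, _nonce = commit_to_vote("Candidate A")
published_record = {c}

# Voter side: the receipt shows c; finding it in the published record
# confirms the ballot was recorded as cast.
assert voter_checks_receipt(c, published_record)
```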

  36. VoteHere Sentinel: cons • Application software missing (only reference library exists) • More complicated: voter experience, conceptual model, election officials must maintain web site • Most voters will not understand the cryptography • No attempt to maintain consistency between DRE and Sentinel • $500 / unit

  37. Parallel Testing • Attempts to detect widespread corruption of DREs • Tests randomly-selected DREs on election day in simulated election • Limitations: • Can adversary “signal” selected DREs? • Number and choice of DREs for testing

  38. Probability of Selecting Bad DRE

  39. Probability of Selecting Bad DRE
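
  A minimal sketch of the computation behind slide 25's challenge, using the standard sampling-without-replacement argument: P(at least one bad DRE in the sample) = 1 - C(N-B, k) / C(N, k). The values of N, B, and k below are illustrative, not figures from the study.

```python
# Probability of sampling at least one bad DRE (slide 25's math challenge),
# using the standard sampling-without-replacement (hypergeometric) model.
# The example values of N, B, and k are illustrative, not from the study.
from math import comb

def prob_at_least_one_bad(N: int, B: int, k: int) -> float:
    """P(a random sample of k DREs out of N contains at least one of the B bad ones)."""
    # Complement rule: 1 - P(every sampled DRE is good).
    return 1.0 - comb(N - B, k) / comb(N, k)

if __name__ == "__main__":
    N = 20_000                      # roughly Maryland's DRE count (slide 12)
    for B in (100, 500, 1_000):     # hypothetical numbers of bad DREs
        for k in (10, 50, 100):     # hypothetical sample sizes
            p = prob_at_least_one_bad(N, B, k)
            print(f"B={B:5d}  k={k:3d}  P(at least one bad) = {p:.3f}")
```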

  40. Summary Scores

  41. Maryland Procedures

  42. Installing DRE Software • SBE technicians install OS and application software on all DREs (critical process) • Diebold object code from Independent Testing Agency (ITA) • Cryptographic hash check performed on trusted SBE machine • DREs stored at LBEs
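
  Slide 42 mentions a cryptographic hash check of the vendor object code on a trusted SBE machine. A minimal sketch of that kind of check, assuming SHA-256 and a reference digest supplied with the ITA-certified build; the file name and digest here are hypothetical.

```python
# Minimal sketch of an object-code hash check of the kind slide 42 describes.
# The algorithm (SHA-256), file path, and reference digest are assumptions
# for illustration, not details of the SBE's actual procedure.
import hashlib

def file_digest(path: str) -> str:
    """Hex SHA-256 digest of a file, read in 1 MiB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_image(path: str, expected_hex: str) -> bool:
    """Compare the computed digest against the reference value shipped with the certified build."""
    return file_digest(path) == expected_hex.strip().lower()

# Example call (hypothetical file name and digest):
# verify_image("ballot_station_image.bin", "9f86d081884c7d65...")
```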

  43. Voter Authority Cards • Physical card at precinct for each voter • Records DRE used by voter • Poll workers may not ask for photo ID (only utility bill)

  44. Discussion, Conclusions, Open Problems

  45. Modifying Diebold Software • Needed for verification systems • Requires Diebold cooperation • Diebold not commercially motivated • Who pays? • Must pass ITA after any change

  46. Why Are Products Not Better? • Relatively small market • Lack of clear performance standards • Multitude of state and local styles for ballots and reports • Security (and accessibility) is an afterthought • Emerging technologies • Funding technologies for the “social good”

  47. Vendors Should Provide • Product description • Functional specifications • Testable reference implementation • Performance data from mock election • Documentation

  48. Open Problems • Standard interfaces for verifiers • Adversarial data consistency problem • Develop/improve receipt-based systems (e.g., David Chaum’s Punchscan) • Performance ratings guidelines

  49. Adversarial Data Consistency Problem • (DRE and verifier honest) ⇒ tallies agree • Minimize disruption by one dishonest unit • Ex: Voter aborts in middle of process

  50. Adversarial Data Consistency Problem • Two-way communication • enables either unit to cause disruption • facilitates collusion among two dishonest units
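
  A toy simulation of the consistency problem sketched in slides 49-50: with a two-step exchange, an abort between the DRE's commit and the verifier's commit leaves the two records disagreeing even though neither unit misbehaved. Purely illustrative; this is not a model from the study.

```python
# Toy illustration of the adversarial data consistency problem (slides 49-50):
# an abort between the two commit steps leaves the DRE and verifier records
# disagreeing. Purely illustrative; not a model used in the study.
dre_record: list[str] = []
verifier_record: list[str] = []

def cast_vote(vote: str, abort_midway: bool = False) -> None:
    dre_record.append(vote)        # step 1: DRE stores the ballot
    if abort_midway:               # e.g., voter walks away mid-process
        return
    verifier_record.append(vote)   # step 2: verifier stores its copy

cast_vote("A")
cast_vote("B", abort_midway=True)   # inconsistency introduced here
print(dre_record, verifier_record)  # ['A', 'B'] vs ['A']: tallies now disagree
```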
