Unsafe for any Ballot Count: South Carolina’s voting machines and their analysis in Ohio (2007), Florida (2006), and South Carolina (2011)

Duncan A. Buell, Eleanor Hare, Frank Heindel, Chip Moore
For the League of Women Voters of South Carolina


Presentation Transcript


  1. Unsafe for any Ballot Count: South Carolina’s voting machines and their analysis in Ohio (2007), Florida (2006), and South Carolina (2011)
  Duncan A. Buell, Eleanor Hare, Frank Heindel, Chip Moore
  For the League of Women Voters of South Carolina

  2. Source Material
  • SC (Buell/Hare/Heindel/Moore/LWVSC), 2011, analysis of the November 2, 2010, election data
  • Ohio, Dec 2007, study for the Sec’y of State
  • Florida, 2007, study after the 2006 election
  • Ohio, Nov 2003, study for the (previous) SoS
  • California, 2007, study for the SoS
  • Burr, Rivest, et al., for NIST
  Follow links from www.lwvsc.org

  3. Why is Vote-Counting Hard?
  • An election is a one-time event: no do-overs
  • Hard to test the scaling-up to full size
  • Highly distributed, largely independent, using volunteer workers
  • Vulnerable to corruption
  • Vulnerable to disruption
  • Highly vulnerable to error

  4. Issues and Concerns
  • Should voters get a receipt?
  • Are ballots indeed secret?
  • How do we accommodate persons with disabilities?
  • How do we handle overvotes and undervotes?
  • Are ballots voter-verifiable?
  • Are ballots recountable and auditable?
  • Can we audit the results?

  5. A Common Misconception
  A voting machine is NOT like an ATM. (There are laws, you have rights, there are receipts, and your money is somewhere.)
  A voting machine is much more like a slot machine. (What is your guarantee that the machine EVER pays out?)

  6. (Recent) History
  • Florida’s hanging chads and butterfly ballot, 2000
  • HAVA (Help America Vote Act), 2002
  • Florida 13th congressional district election, 2006
  • Lots of complaints, some of which are known to be justified (Horry County, 19 January 2008) and many of which are probably not justified

  7. Electronic Voting Machines
  • South Carolina: Election Systems and Software (ES&S) iVotronic DRE (Direct Recording Electronic) and Unity software/system for counting votes
  • Operative study: EVEREST, submitted December 7, 2007, to the SoS of Ohio, done by UPenn and UC Santa Barbara
  • EVEREST: the ES&S iVotronic systems “lack the fundamental technical controls necessary to guarantee a trustworthy election under operational conditions … from several pervasive, critical failures”

  8. Other Discredited Systems
  • Diebold/Premier (RABA, Avi Rubin/JHU)
  • Sequoia (A. Appel and Ed Felten/Princeton)
  • Nedap (Rop Gonggrijp)
  There are no machines that have been tested by computer experts and have not been discredited.

  9. Voting Machine Testing
  • All machines are tested by “Independent Testing Authorities” (ITAs)
  • But there are only a few ITAs
  • And one was decertified for falsifying tests
  • And none test for “computer security” issues
  • And the paper trail shows that the same problem can occur multiple times without being fixed, but with ITA certification

  10. The Issues
  • Security: can the system be corrupted?
  • Quality: can the system be trusted to be correct?
  • Human factors: can the system function as it should under normal conditions?

  11. Security

  12. Security (pages 29-30)
  “lack the fundamental technical controls necessary to guarantee a trustworthy election under operational conditions … from several pervasive, critical failures”
  “…we attempted to identify practical procedural safeguards that might substantially increase the security of the ES&S system in practice. We regret that we ultimately failed to find any such procedures that we could recommend with any degree of confidence.”

  13. Security (pages 29-30)
  “The security failings of the ES&S system are severe and pervasive. There are exploitable weaknesses in virtually every election device and software module, and we found practical attacks that can be mounted by almost any participant in an election. For this reason, the team feels strongly that any prudent approach to securing ES&S-based elections must include a substantial re-engineering of the software and firmware to make it ‘secure by design’.”

  14. Security Through Obscurity?
  The Palm Pilot emulates a PEB (Personalized Electronic Ballot) and can reset all passwords. (page 66)

  15. Security Through Obscurity? (page 52)
  “The mechanical locks supplied … were uniformly of very low-security designs that can easily be picked …”
  “For the first weeks of the project, we did not have the correct keys for much of the equipment; we frequently had to pick the locks in order to conduct our analysis.”

  16. Software Quality

  17. Software Quality
  • Writing bad, confusing, un-maintainable, and sloppy code is not that hard.
  • Writing clean, professional, maintainable, secure code that does exactly and only what it is intended to do is very hard.
  • What we would simply mark off in a freshman’s work would be unacceptable from a senior.

  18. Software Quality
  “a visible lack of sound software … practices”
  “a buggy, unstable, and exploitable system”

  19. The ES&S System (page 84)
  • 515,000 lines of code
  • Nine programming languages
  • Four hardware platforms
  A large and complicated computer system by any standard

  20. Code Analysis (pp. 53ff, 83ff)
  • All code modules have buffer overflow bugs. “Avoiding buffer overflow bugs in input processing is regarded as one of the most basic defenses a system must have.”
  • About 63% of the code is in memory-unsafe programming languages.
  • Compilation on Visual Studio 2005 fails unless one turns off modern security standards.
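To make the buffer-overflow finding concrete, here is a minimal C sketch of the bug class EVEREST describes: an unbounded string copy during input processing. The function and input are hypothetical, not actual ES&S code.

```c
/* Hypothetical illustration of the EVEREST bug class, not ES&S code. */
#include <stdio.h>
#include <string.h>

static void load_precinct_name(const char *input)
{
    char name[16];
    /* BUG: strcpy() copies until the NUL terminator with no bounds
       check; any input longer than 15 characters overruns 'name'
       and corrupts adjacent stack memory. */
    strcpy(name, input);
    printf("precinct: %s\n", name);
}

int main(void)
{
    /* An oversized input silently smashes the 16-byte buffer. */
    load_precinct_name("THIS PRECINCT NAME IS FAR TOO LONG FOR THE BUFFER");
    return 0;
}
```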

  21. Code Analysis (pp. 53ff, 83ff)
  • Fortify (a standard code analysis program) finds hundreds of vulnerabilities in the source code, which indicates “that the vendor did not sufficiently validate their code.”
  • In grading CSCE 240 undergraduate homework, I take off 20% for EACH use of a memory-unsafe function.
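The fix a grader would expect, applied to the same hypothetical function above: replace the unbounded, memory-unsafe copy with a bounded one.

```c
#include <stdio.h>

static void load_precinct_name(const char *input)
{
    char name[16];
    /* snprintf() writes at most sizeof(name) bytes, including the
       terminating NUL, so oversized input is truncated instead of
       overflowing the stack. */
    snprintf(name, sizeof name, "%s", input);
    printf("precinct: %s\n", name);
}

int main(void)
{
    load_precinct_name("THIS PRECINCT NAME IS FAR TOO LONG FOR THE BUFFER");
    return 0;
}
```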

  22. Passwords (Florida excerpt)
  • Passwords are hard-coded in the firmware, identical in every machine.
  • An undocumented back door exists.
  • “This represents poor practice”
  • “These passwords provide very little security.”
  • “poorly conceived and poorly implemented”
  • Passwords are coded in the clear in devices.
  • Crypto keys are stored in the clear.

  23. Passwords (Florida excerpt) “The Service Menu password, Clear and Test password, ECA password, and Upload Firmware password are three-letter case-insensitive passwords. Each one is chosen to be mnemonic and easy to remember. The problem is that they are also likely to be fairly easy to guess. They follow a memorable pattern. Someone who knows one of these passwords can probably guess what the other ones are without too much difficulty.”
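The arithmetic behind that warning: a three-letter case-insensitive password has only 26^3 = 17,576 possibilities, a keyspace a trivial loop exhausts in well under a second. A C sketch, with a made-up password standing in for the real ones:

```c
/* Brute-forcing a three-letter case-insensitive password.
   The password "vte" here is invented for illustration. */
#include <stdio.h>
#include <string.h>

static int check_password(const char *guess)
{
    return strcmp(guess, "vte") == 0;  /* stand-in for a hard-coded password */
}

int main(void)
{
    char guess[4] = {0};
    long tries = 0;
    for (char a = 'a'; a <= 'z'; a++)
        for (char b = 'a'; b <= 'z'; b++)
            for (char c = 'a'; c <= 'z'; c++) {
                guess[0] = a; guess[1] = b; guess[2] = c;
                tries++;
                if (check_password(guess)) {
                    printf("found \"%s\" after %ld of 17576 tries\n",
                           guess, tries);
                    return 0;
                }
            }
    return 1;
}
```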

  24. Ballot Image Randomization (page 73)
  • The iVotronic “uses a weak randomization procedure” that “does not properly randomize voter selections in its audit logs”.
  • Random number generation is a well-established mathematical and computational science. NIST even publishes a testing document and test suite (Publ. 800-22).
  • Failing to use proper, tested RN generators is just unprofessional and sloppy.
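A sketch of the difference, assuming a Unix-like system with /dev/urandom: the “weak” version is the classic mistake (a time-seeded rand()); the “better” version is a Fisher-Yates shuffle driven by OS entropy. This is illustrative, not the iVotronic’s actual audit-log code.

```c
/* Weak vs. proper randomization of stored ballot images (illustrative). */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

/* WEAK: seeding with the clock yields only a handful of plausible seeds
   on election day, so the "random" storage order can be reconstructed
   and ballot secrecy broken. */
static void weak_shuffle(int *ballots, int n)
{
    srand((unsigned)time(NULL));
    for (int i = n - 1; i > 0; i--) {
        int j = rand() % (i + 1);
        int t = ballots[i]; ballots[i] = ballots[j]; ballots[j] = t;
    }
}

/* BETTER: Fisher-Yates driven by the OS entropy pool; the permutation
   cannot be reproduced from the time of day. (Rejection sampling would
   also remove the small modulo bias; omitted for brevity.) */
static int strong_shuffle(int *ballots, int n)
{
    FILE *urandom = fopen("/dev/urandom", "rb");
    if (!urandom)
        return -1;
    for (int i = n - 1; i > 0; i--) {
        unsigned int r;
        if (fread(&r, sizeof r, 1, urandom) != 1) {
            fclose(urandom);
            return -1;
        }
        int j = (int)(r % (unsigned)(i + 1));
        int t = ballots[i]; ballots[i] = ballots[j]; ballots[j] = t;
    }
    fclose(urandom);
    return 0;
}

int main(void)
{
    int ballots[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    weak_shuffle(ballots, 8);    /* don't do this */
    strong_shuffle(ballots, 8);  /* do this */
    for (int i = 0; i < 8; i++)
        printf("%d ", ballots[i]);
    printf("\n");
    return 0;
}
```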

  25. Software Quality Summary
  • These software problems are common in the code written by first-year students.
  • A first-year student’s A grade (for submitting code that ostensibly worked) would probably drop to a C for these errors.
  • A senior student’s A grade (for submitting code that ostensibly worked) would probably drop to an F.

  26. Human Factors

  27. Human Factors
  Duncan Buell, Eleanor Hare, Frank Heindel, Chip Moore
  FOIA’d data from several counties, including Richland, Charleston, Colleton, Lancaster, Berkeley, Lexington, Sumter, and Florence.
  We have tried to reconcile the certified official counts with the counts that are supported by the data. We have yet to find a county whose numbers add up properly.

  28. LWVSC Press Release, 14 Feb 2011 http://www.lwvsc.org

  29. The State Newspaper

  30. What Actually Happened?
  Two paths to “the truth” of the count:
  • PEBs collect totals from machines, and these totals go into a master file (acting like slot machine tapes)
  • Individual vote data is collected from memory cards and goes into a vote image file (acting like the cash drawer)
  We tried to verify that these two truths were the same.
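A minimal C sketch of that verification (the authors’ actual tools were written in Perl and Java); every number and structure here is hypothetical.

```c
/* Reconcile PEB machine totals against tallies of individual ballot
   images. All data below is invented for illustration. */
#include <stdio.h>

#define NUM_CANDIDATES 3

int main(void)
{
    /* Totals as uploaded from the PEBs (the "slot machine tape").
       Hypothetical numbers: the PEB claims one more vote for
       candidate 2 than the ballot images support. */
    int peb_totals[NUM_CANDIDATES] = {3, 2, 2};

    /* Individual ballot images from the memory cards (the "cash
       drawer"): one candidate index per recorded vote. */
    int ballot_images[] = {0, 1, 2, 0, 1, 0};
    int n_images = (int)(sizeof ballot_images / sizeof ballot_images[0]);

    int image_totals[NUM_CANDIDATES] = {0};
    for (int i = 0; i < n_images; i++)
        image_totals[ballot_images[i]]++;

    /* The two "truths" must agree; a discrepancy means missing
       cards, missing PEB uploads, or machines never closed. */
    for (int c = 0; c < NUM_CANDIDATES; c++)
        if (peb_totals[c] != image_totals[c])
            printf("candidate %d: PEB total %d, image total %d -- MISMATCH\n",
                   c, peb_totals[c], image_totals[c]);
    return 0;
}
```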

  31. What Actually Happened?
  We found four different errors:
  • Memory cards not collected, so individual votes were not in the vote image file.
  • Two entire precincts were missing from the vote image file.
  • TWO PEBs (not one) were used to collect data in Ward 21, but only one had its totals uploaded.
  • SIX machines were not closed in Bluff, and their data were not collected until 11/9/2010.
  • In all: 1,127 votes not counted, 2,800 votes without support.

  32. What Next? Hmmmm …

  33. What Next?
  Chip Moore and I are donating our code (Perl and Java), and I meet with Richland County on Thursday.
  LWVSC proposes a statewide mandate for this kind of audit (sketched below):
  • Each machine used should be verified to be closed and its data collected.
  • Each PEB used should be accounted for and its data collected.
  • The PEB totals should match the vote image totals.
  Not rocket science: mine was an afternoon of coding and a minute on my laptop for all of Richland County.
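The first two checks reduce to simple bookkeeping over per-machine records, as in this hypothetical C sketch (all fields and data invented); the third is the totals reconciliation sketched after slide 30.

```c
/* Hypothetical per-machine audit for the proposed statewide mandate. */
#include <stdio.h>
#include <stdbool.h>

struct machine_record {
    int  id;
    bool closed;          /* machine properly closed on election night */
    bool card_collected;  /* its memory card reached the vote image file */
    bool peb_uploaded;    /* the PEB that served it had its totals uploaded */
};

static bool audit(const struct machine_record *m, int n)
{
    bool ok = true;
    for (int i = 0; i < n; i++)
        if (!m[i].closed || !m[i].card_collected || !m[i].peb_uploaded) {
            printf("machine %d fails audit\n", m[i].id);
            ok = false;
        }
    return ok;
}

int main(void)
{
    struct machine_record county[] = {
        {101, true,  true,  true},
        {102, true,  false, true},  /* card never collected */
        {103, false, false, false}  /* never closed (the Bluff scenario) */
    };
    printf("audit %s\n", audit(county, 3) ? "PASSED" : "FAILED");
    return 0;
}
```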

  34. LWVUS Positions
  SARAT: voting systems must be Secure, Accurate, Reliable, Accessible, and Transparent, and voting systems must provide a paper ballot or record of the voter’s intent that the voter can verify during the voting process and that can be used for random audits and recounts. (LWV, Impact on Issues 2006-2008, p. 11)

  35. LWVSC Positions
  Voting machines must
  • include a paper audit trail that allows the voter to verify his/her vote and provides a reliable basis for a recount if required,
  • be randomly tested during every election, and
  • use source code that is open for inspection.
  LWVSC believes that SC’s iVotronics do not meet these criteria.

  36. The End
