“Summary of Electronic Voting System Analysis.”

Duke University PhD candidate Justin Moore recently wrote an analysis of the undervote rates of voting systems in use during last month’s elections, for submission to the Hugo Commission. I found it so fascinating that I asked permission to reproduce it here. Justin kindly obliged, so here it is.

Summary of Electronic Voting System Analysis
Justin Moore
November 21, 2005

This subcommittee has undertaken a study of one of the most fundamental aspects of government: how citizens select their representatives and shape their government. I was privileged to observe a similar process — from start to finish — in my current home of North Carolina, but was thankful for the opportunity to participate in the deliberations of my native Virginia. As a computer engineer with a background in distributed systems, security, and software development, I hope to contribute solid, constructive, and scientific data to the subcommittee. Unfortunately, my analysis shows that Virginia has been too hasty in its acceptance of paperless voting machines, and that this haste likely affected the result of at least one race in Virginia in 2005. Additionally, it is likely that these subtle failures — detectable through simple statistical analysis — are unrecoverable: even though one candidate will become Attorney General, the true victor will never be known. These are not wild-eyed, tin-foil-hat-wearing conspiracy theories; they are merely the logical result of using a low-quality product built to poor development standards.

I realize committee members have been inundated with page after page of statements, position papers, and studies. My goal is simple: to remind members that the most important and most fundamental property of a good voting system is that it be reliable. Even the most disabled-accessible and well-liked system is useless if it is malfunctioning. During the hearing on August 21st — and in written statements since — I saw many opponents of voter-verified paper ballots discuss how much voters and election officials like paperless systems, how few complaints they received about paperless systems, and how easy these paperless systems were to use.

However, this “popularity” metric is next-to-useless in the engineering world. Engineering is not a popularity contest. It is a meritocracy, with the best ideas winning. We, as computer scientists, know how to write solid programs; we write them to fly our airplanes, to run our pacemakers, and to control military weapons. We do not, however, know how to write solid programs “on the cheap”. A typical software project as complex as a paperless voting machine costs billions of dollars to develop. The control code for the Boeing 777 — at one-tenth the complexity of Microsoft Windows — cost over $2 billion to develop, of which nearly $1 billion was spent on quality assurance and testing.

Contest            Governor   Lieutenant Governor   Attorney General
Optical Scanners   1.13%      2.82%                 2.85%
DRE Machines       1.21%      4.28%                 3.99%
Lever Machines     1.16%      6.45%                 6.26%
Table 1: Undervote rates in the 2005 Virginia General Election; smaller is generally better. Note that — with the exception of the Gubernatorial race, with its low undervote rate across all technologies — undervote rates for counties using paperless machines are nearly 50% higher than those using optical scanners.

The bottom line is that voting machine vendors do not have the resources necessary to develop software that is anywhere near as reliable as these critical systems. Even the latest proposed standards from the Election Assistance Commission require that a modern voting machine be failure-free for only 163 hours, on average. This standard — unchanged since 1990 — means that nearly 10% of all voting machines can experience a failure during Election Day and still meet or exceed federal standards.
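As a back-of-the-envelope check of that figure, consider the following minimal Python sketch; the 15-hour Election Day and the exponential failure model are assumptions of mine, not part of the standard:

```python
import math

MTBF_HOURS = 163.0       # mean time between failures required by the standard
POLLS_OPEN_HOURS = 15.0  # assumed Election Day length, e.g. 6 AM to 9 PM

# Under an exponential failure model, a machine that exactly meets the
# standard fails at least once while polls are open with probability:
p_fail = 1.0 - math.exp(-POLLS_OPEN_HOURS / MTBF_HOURS)

print(f"P(failure during Election Day) = {p_fail:.1%}")  # about 8.8%
```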

These weak standards do nothing to weed out bad machines and, not surprisingly, more complex machines experience failures, abnormalities, and glitches more often. Table 1 provides evidence to back this hypothesis. On average, optical scan systems recorded a larger percentage of votes than paperless systems. Analysis of the raw data from the State Board of Elections website — as of 3:50 AM, EST, November 21, 2005 — shows that the single most significant factor in predicting undervote rate is the technology used by the voter. (An undervote is a ballot in which the voter did not make a selection for that particular race.) Our two options are to conclude either that voters in counties using paperless systems somehow care less about politics and are less likely to vote in a race for which they went to the polls, or that the technology used by those voters is of lesser quality than that used in other counties.
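For readers who want to reproduce this from the raw SBE data, the per-technology computation is straightforward. Here is a minimal sketch; the county totals below are invented placeholders, not the actual returns:

```python
# Sketch: per-technology undervote rates from county-level totals.
# Technology labels and figures here are hypothetical placeholders.
counties = [
    # (technology, ballots cast, votes recorded in this race)
    ("optical", 10_000, 9_720),
    ("optical",  8_000,  7_770),
    ("dre",     12_000, 11_520),
    ("dre",      9_000,  8_640),
]

totals = {}
for tech, ballots, votes in counties:
    cast, recorded = totals.get(tech, (0, 0))
    totals[tech] = (cast + ballots, recorded + votes)

for tech, (cast, recorded) in totals.items():
    print(f"{tech}: undervote rate {(cast - recorded) / cast:.2%}")
```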

Contest            President   Amendment One   Amendment Two
Optical Scanners   0.89%       12.45%          10.66%
DRE Machines       1.15%       15.51%          13.07%
Lever Machines     1.04%       30.23%          29.85%
Table 2: Undervote rates in the 2004 Virginia General Election; smaller is generally better. Note that — with the exception of the Presidential race, with its low undervote rate across all technologies — undervote rates for counties using paperless machines are nearly 30% higher than those using optical scanners.

We can test this by examining the results of the 2004 General Election in Virginia, shown in Table 2. Again, paperless machines record fewer votes than optical scan systems. Since seven counties switched from optical scan systems in 2004 to paperless systems in 2005 and saw a relative increase in their undervote rates, we can conclude it is not the voters: it is the technology.
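The before-and-after comparison for those counties can be scripted the same way; a sketch with illustrative rather than actual rates:

```python
# Sketch: counties that moved from optical scan (2004) to DREs (2005).
# County names and undervote rates are illustrative, not the SBE figures.
switchers = {
    "County A": (0.028, 0.041),  # (2004 rate, 2005 rate)
    "County B": (0.025, 0.038),
}

for county, (before, after) in switchers.items():
    change = (after - before) / before
    print(f"{county}: {before:.1%} -> {after:.1%} ({change:+.0%} relative)")
```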

Figure 1: Correlation between undervote rates in the Lieutenant Governor’s race and the Attorney General’s race in Virginia in 2005. Note the strong correlation for optical scan-using counties; if 3% of the voters didn’t cast a vote for Lieutenant Governor, about 3% of voters also didn’t cast a vote for Attorney General. This is consistent with voters who only care about the top-of-ballot race: Governor. However, in DRE-using counties, there is little to no correlation, and higher undervote rates. This strongly suggests an underlying problem with the machines.

The last piece of evidence is another peek at voter behavior. The 2005 election cycle in Virginia was very Governor-centric; there was little enthusiasm or even interest in the races for Lieutenant Governor or Attorney General, as most of the attention was on Tim Kaine, Jerry Kilgore, and (to some extent) Russ Potts (and his pots-banging ad campaign). Given this, we would expect two classes of voters: those who vote for Governor only, and those who vote a full ticket.

This would produce a voting pattern in which the number of undervotes in the Attorney General’s race and Lieutenant Governor’s race would be roughly the same.

Here, we chart the data. If 2% of voters who went to the polls in a given locality didn’t vote for Attorney General, and 3% of the voters in that same locality didn’t cast a vote for Lieutenant Governor, we would place a mark at position (2, 3). Given our prediction about voting patterns, we expect these numbers to be roughly the same within each locality.
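A plot like this takes only a few lines of Python; in the sketch below, the locality points are invented for illustration:

```python
import matplotlib.pyplot as plt

# Sketch: one point per locality at (AG undervote %, LG undervote %),
# grouped by voting technology. All data points are hypothetical.
localities = [
    # (AG undervote %, LG undervote %, technology)
    (2.1, 2.0, "optical"),
    (3.0, 3.1, "optical"),
    (2.5, 4.8, "dre"),
    (5.9, 3.2, "dre"),
]

for tech in ("optical", "dre"):
    xs = [ag for ag, lg, t in localities if t == tech]
    ys = [lg for ag, lg, t in localities if t == tech]
    plt.scatter(xs, ys, label=tech)

plt.plot([0, 7], [0, 7], linestyle="--")  # y = x: equal undervote rates
plt.xlabel("Attorney General undervote (%)")
plt.ylabel("Lieutenant Governor undervote (%)")
plt.legend()
plt.show()
```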

Figure 1 charts this relationship. In counties using optical scanners, we see the expected correlation: a straight line of points. The largest difference is in Loudoun County, in which 1.5% of voters did not choose a Lieutenant Governor and 2.1% of voters did not choose an Attorney General. The “most apathetic” optical scan locality is Fredericksburg City, in which 6.2% of voters cast no ballot for Lieutenant Governor or Attorney General.

Localities using paperless machines are a completely different story. One-fourth of DRE-using localities had a discrepancy larger than Loudoun County’s. A dozen localities had undervote rates higher than Fredericksburg City’s. Either voters in these localities are less consistent and more erratic than those in optical-scan counties, or the technology has problems. It is also important to note that these results already take into account the processes and procedures election officials use to prevent the loss of votes.

Had DRE systems performed as reliably as optical scan systems in Virginia this month, there would be another 12,430 votes for Lieutenant Governor and another 9,400 votes for Attorney General. It is entirely likely that the malfunction of paperless machines in Virginia in 2005 affected the outcome of the race for Attorney General. The margin of error is larger than the margin of victory by a factor of more than 25, and no amount of popularity or customer satisfaction can change this likelihood.
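One way to arrive at figures like these is to apply the optical-scan undervote rate to the ballots cast on DREs. A minimal sketch, using the Attorney General rates from Table 1 and an assumed placeholder for the DRE ballot total:

```python
# Sketch: votes "lost" to the higher DRE undervote rate, estimated by
# applying the optical-scan rate to DRE ballots cast.
dre_ballots_cast = 800_000  # placeholder, not the actual statewide count
optical_rate_ag = 0.0285    # Attorney General, optical scan (Table 1)
dre_rate_ag = 0.0399        # Attorney General, DRE (Table 1)

lost = dre_ballots_cast * (dre_rate_ag - optical_rate_ag)
print(f"Estimated additional AG votes if DREs matched optical scan: {lost:,.0f}")
```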

In conclusion, I recommend that Virginia mandate a voter-verified paper ballot for every voting system, have this ballot be the official record of voter intent for the purpose of recounts and audits, and institute mandatory random sample audits of these paper records for all elections to detect machine failures (malicious and benign). These mandates and procedures will ensure that the voice of every voter is heard and safeguarded at every step of the election process.

Published by Waldo Jaquith

Waldo Jaquith (JAKE-with) is an open government technologist who lives near Charlottesville, VA, USA.

5 replies on ““Summary of Electronic Voting System Analysis.””

  1. I agree that Virginia’s (and most states’) system of election administration is currently completely incapable of giving voters any confidence whatsoever in stated outcomes of extremely close elections such as Deeds-McDonnell. But my hunch is that this major failure has more to do with the lack of uniformity in voting equipment across localities than it does with DRE malfunction. It’s a simple fact that different patterns in voter use of different equipment manifest themselves in most final vote tallies. It’s possible that when a certain locality’s voters are more easily able to click through all the touchscreens to get to the “cast vote” page (as opposed to filling in the scantron all on one page), the undervote rate increases for that locality. Perhaps there should be more testing of paperless machines, but variance in voter reaction to ballot formats strikes me as a more likely culprit.

    Either way, the consequence is that by the time “all precincts have reported,” the process of adding a statewide candidate’s vote totals from different localities together becomes much like adding apples and oranges. The fact that vote totals in extremely close races begin to border on irrelevance is what angers me most about election administration in VA and elsewhere.

    Alright, now on to my mini-exposé of problems plaguing the Deeds-McDonnell “official results,” which I’ve compiled by neglecting studying for exams in favor of going on a scavenger hunt to find 323 votes for my main man Creigh. The joys of being a full-time partisan Dem.

    Presenting…
    IF JOE KEARFOTT & THE DEEDS LEGAL TEAM AREN’T ON THESE, THEY NEED TO BE:

    1. “The ‘most apathetic’ optical scan locality is Fredericksburg City, in which 6.2% of voters cast no ballot for Lieutenant Governor or Attorney General.”

    Actually, this looks to be a mischaracterization. In City Ward 2 Precinct 1, the full SBE Excel readout reports that 402 out of the precinct’s 804 registered voters voted in person on November 8th. But the total votes for each statewide race hovered just under 250, for an absurd undervote rate in the 40% range, a strong statistical unlikelihood not matched anywhere else in the city. Creigh Deeds won this precinct 182-55 – if there were 100-150 more uncounted votes, what of them? It could also simply be that the number of total in-person voters for this precinct was misreported. Either way, something’s rotten in F’burg.

    2. At the Hodgesville voting precinct in SWVA’s Franklin County, there were 201 votes reported in the AG’s race: 106 for McD, 95 for Deeds. But in the Governor’s and LG’s races, there were only 172 and 166 total votes reported respectively (Kaine won this precinct 93-75). This overvote in the AG’s race is also a far-out statistical outlier that seems to have advantaged McDonnell, but it is also possible that somehow there were many uncounted votes in the top two races on the ballot. Definitely something wrong here.

    3. Roanoke County’s 9th District absentee precinct reported 83 votes for Kaine and 87 for Kilgore. But in the 9th, where Deeds tended to run much better than Kaine, the full SBE readout reports that Deeds lost the absentees there 66-105. While this is much more plausible than #2, it too represents a glaring aberration from typical voting patterns in the region. We’re not talking about a lot of votes here, but suppose that as the result of a human error those numbers were simply transposed into the wrong columns: 255 is a relatively long way on the path from 323 to -1.

    The bottom line? Errors abound, and we’ll never really know who won on November 8th.

  2. I’m shocked to discover the low standards to which we hold balloting. I wonder if the desire for a rapid conclusion outstrips the desire for accurate results? I’m as eager as anyone to know who won on election day, but I would gladly wait days or weeks if it meant that ballots were counted meticulously.

  3. Keep in mind, though, that the $100,000 in the Charlottesville 2007 budget is only for creating a paper audit, if the legislature mandates it. It’s not like the city intends to take the initiative here.
