Building a Better Voting Machine
DG Team Assesses the State of Voting Technology Methods for More Reliable Elections
By
For the DGRC

Assessing Ballot Tech
 

Researcher profile: Paul S. Herrnson

Project Profile:
An Assessment of Voting Technology and Ballot Design

Home: http://www.capc.umd.edu/rpts/VotingTech_par.html




Federalism is a remarkable balance of competing interests. It has kept our republic together for over two centuries. On a philosophical level it can be as strong and resilient as spider silk. Unfortunately, on a pragmatic level, it can be just as sticky and messy.

Consider voting - it would seem we could answer all the legitimate concerns, and even the most paranoid conspiracy theories, about fair and honest elections if we'd simply decide on one voting system and style. But the very reason we've succeeded so well as a democracy is the very reason we can't standardize on a way to vote: Our federal system does not permit the mandating of one nationwide system.

Indeed, if ever there were a dream project for a Digital Government researcher, it would be to look at something as fundamental as using technology to ensure full enfranchisement.

Paul S. Herrnson, director of the Center for American Politics and Citizenship at the University of Maryland, College Park, and his team are doing just that. They are testing several voting machines and various styles of ballots to see which ones people find most comfortable - and which ones most accurately record their votes. Their work ("An Assessment of Voting Technology and Ballot Design") will be used by the National Institute of Standards and Technology (NIST) to help draw up voting system guidelines under HAVA, the Help America Vote Act.

Florida's infamous butterfly ballot seems less of an anomaly when you consider that for years many jurisdictions used mechanical voting machines dating to before the turn of the last century. They look great in Norman Rockwell paintings, but they often placed candidates' names or ballot questions too high up for many people to read, and they had stiff levers that were hard for elderly or disabled voters to move. Too often throughout our history, voting machines were simply the best that equipment manufacturers of the day could design, with the emphasis on quick and accurate tabulation rather than on whether citizens actually found the machines easy to use. Imagine if car manufacturers concentrated on making cars go fast, without regard to whether or not motorists could find the steering wheel.

HAVA, passed after the 2000 election, does not - and cannot - specify one standard voting system. It does, however, authorize NIST to create a set of guidelines that vendors must use in creating voting equipment. NIST wants to make sure the machines are usable and accessible across the entire spectrum of the US voting-age population. States are still free to buy whatever equipment they prefer, but they will lose the carrot of federal funding if they purchase machines that do not meet the federal guidelines.

NIST is not a Consumer Reports-type testing service, explains Sharon Laskowski, Computer Scientist and Group Manager in the Information Access and User Interfaces Division of the Information Technology Laboratory at NIST. They do not put pieces of equipment up against each other in head-to-head competition, but instead try to establish reasonable requirements and benchmarks that any well-working piece of equipment should be able to meet. "We're looking at current best practice in user-centered design and usability testing," says Laskowski. "We're very excited about putting in usability standards for voting."

This is where Herrnson's work comes in. He and his team recruited 1,536 participants in three states, including many elderly voters on the assumption that they might be most challenged by electronic voting machines. Participants were asked to go through the voting process on six different machines. (For some elderly participants, it was limited to four machines.) Ballot styles were varied to simulate the types of ballots voters encounter in different states, but for the most part individuals each voted on one style of ballot.

They were given instructions on how to vote - including writing in candidates, and changing their choices - so that the researchers could see how people from diverse socio-economic backgrounds handled the same task on each machine. Then the participants were surveyed on their experiences. According to their comments, the very same action, such as a write-in vote, could be simple or frustrating depending on the machine used.


Here's the surprise: Laskowski says that this is the first time voting systems have been usability-tested with large numbers of voters actually trying to vote. The field of software usability engineering itself, she points out, is quite new - barely more than 15 years old. The idea of a voter as an end user, whose comfort and confidence must be taken into consideration, is a novel one that only came to the fore after the contested 2000 presidential election.

Now vendors offer over two dozen different kinds of machines, and states use several different styles of ballot. Herrnson's group is testing a range of equipment, from ATM-style touchscreen electronic machines to a dial-operated machine to the group's own custom prototype. The prototype has an interactive "zoomable" interface, which, according to Herrnson, "enables the voter to navigate freely between an overview of the entire ballot and the details of a specific race. For example, if the voter touches the box on the screen titled U.S. Senate, then the screen that lists the candidates for U.S. Senate will 'zoom' into view."
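
To make that navigation idea concrete, here is a minimal sketch in Python - with hypothetical class and method names, not anything taken from the Maryland prototype itself - of a two-level "zoomable" ballot that moves between a full-ballot overview and a zoomed view of a single race:

# A minimal sketch of a two-level "zoomable" ballot, assuming a simple model:
# an overview of every race, plus a zoomed-in view of one race at a time.
# All names here are hypothetical; the actual prototype is a touchscreen GUI.
class ZoomableBallot:
    def __init__(self, races):
        self.races = races            # race title -> list of candidate names
        self.zoomed_race = None       # None means the overview is on screen
        self.selections = {}          # race title -> chosen candidate

    def touch(self, race_title):
        # Voter touches a race box on the overview: zoom into that race.
        if race_title in self.races:
            self.zoomed_race = race_title

    def choose(self, candidate):
        # Voter picks a candidate in the zoomed view, then zooms back out.
        if self.zoomed_race and candidate in self.races[self.zoomed_race]:
            self.selections[self.zoomed_race] = candidate
            self.zoomed_race = None

    def view(self):
        # What the screen shows in the current state.
        if self.zoomed_race:
            return {"race": self.zoomed_race, "candidates": self.races[self.zoomed_race]}
        return {r: self.selections.get(r, "no selection") for r in self.races}

# Touching "U.S. Senate" zooms into that race; choosing a candidate zooms back out.
ballot = ZoomableBallot({"U.S. Senate": ["Candidate A", "Candidate B"],
                         "Governor": ["Candidate C", "Candidate D"]})
ballot.touch("U.S. Senate")
print(ballot.view())    # zoomed view of the Senate race
ballot.choose("Candidate A")
print(ballot.view())    # full-ballot overview with the selection recorded

The real prototype obviously does far more - touch targets, accessibility features, write-ins - but the overview-to-detail state change is the essence of what Herrnson describes.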

"Some machines definitely evoked a more negative response than others," says Herrnson, who is still tabulating results. Favored machines generally had touchscreens that allowed voters more control when navigating through the ballot. One of the surprises to come out of the preliminary results is that despite all the emphasis on having paper trails for electronic voting machines, the test-voters' confidence levels were not affected by the presence or absence of a paper receipt, according to Herrnson.

Herrnson says that one of the most important results is still being pulled together: the correlation between an individual's assessment of a machine and whether his or her votes were accurately cast. "We're very interested in Paul's results and his experimental design," says Laskowski. "Usability and accessibility are a critical part of getting voting systems to capture the voter's intent. They're just as important as security, reliability, transparency and auditing."
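
As a rough illustration of that kind of analysis - a sketch only, with invented data, field names, and analysis choice rather than anything from the actual study - one could compare each participant's rating of a machine with whether his or her ballot matched the voting instructions:

# Sketch: relating participants' ratings of a machine to whether their votes
# were cast as instructed. The records and the simple Pearson / point-biserial
# correlation here are illustrative, not the study's own data or methods.
from statistics import correlation   # Python 3.10+

# Each record: (machine, rating on a 1-7 scale, 1 if ballot matched instructions else 0)
records = [
    ("touchscreen", 6, 1), ("touchscreen", 7, 1), ("touchscreen", 4, 0),
    ("dial",        3, 0), ("dial",        5, 1), ("dial",        2, 0),
]
ratings  = [r for _, r, _ in records]
accurate = [a for _, _, a in records]

# A positive value would suggest that machines voters rate highly also tend
# to capture their intent more often.
print(correlation(ratings, accurate))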

But as for the idea, floated from time to time, that for the good of the nation all states and jurisdictions should agree on one standard machine and style of ballot, Herrnson sighs, "There's also talk about sending people to Jupiter."