Poll workers and election integrity

How do interactions between poll workers, technology, and other election materials support better elections?

Most research about security in elections has focused on risks related to physical and software vulnerabilities of voting equipment. But none of that research has looked in any meaningful way at what must happen in the interaction between voting systems and poll workers (possibly a major vulnerability of the overall voting system) to make poll workers truly effective in their vital role in administering elections.

We focused on the implications for election integrity of the interactions between people, technology, and materials, rather than just on possible vulnerabilities in the technology.

From November 2012 to November 2013, our team of researchers observed poll workers as they opened and closed their polling places for 19 elections in 12 states. These elections included the 2012 presidential elections and a variety of local elections. We chose the elections to include a variety of voting systems, types of elections, counting methods, and other local procedures.

Research in polling places 2012-2013

Map showing research locations

We observed 19 elections in 12 states. Our selection was a purposeful sample chosen to provide a range of different types of jurisdictions. We looked for polling places:

  • where we could observe poll workers working with a variety of voting systems from paper ballots to fully electronic systems
  • representing different areas of the country and a range of neighborhood settings from inner city to small cities, suburbs, and rural towns
  • with a variety of approaches to election administration and process

The selection was also a form of convenience sample, based on the election calendar for local elections and primaries in the spring of 2013 after initial observations in November 2012.

Project activity and approach

Our study used ethnographic techniques to systematically study election days from the point of view of the people who make them happen. The project included several phases of work:

  • Review. The researchers reviewed manuals and forms, and reviewed or attended poll worker training where we could. We wanted to gauge how much of the content for poll workers related to security and what it covered.
  • Observe. The centerpiece of the project was observing set up and shut down of the polls. Researchers watched without interfering with the poll workers at the polling place and (in a few cases) in training.
  • Write up. We created a structured note-taking guide that helped 17 researchers coordinate which activities and materials to observe throughout the day.
  • Interview. We conducted both informal discussions during the observation period and semi-structured follow-up interviews with election officials and poll workers.
  • Analyze. We analyzed these field notes to reveal patterns and trends, which led to the insights in our report.
  • Collaborate. We asked the election officials who hosted the research teams to review the report and share their perspectives on how to support election integrity.


We expected that security problems involving poll workers would most likely be inadvertent, originating in mistakes rather than in purposeful attacks. In answering our research questions, we learned:

  • In most of the places we studied, poll workers have, and use, procedures designed for security. Security is not a separate layer that is consciously and explicitly carried out. Election officials approach security as an integral part of elections and attempt to design election procedures to support trust in the election. Poll workers follow those procedures to the best of their ability.
  • Even in some of the jurisdictions that seemed the most thorough and organized, some procedures don’t make sense, or aren’t complete, accurate, or clear. Poll workers generally try to do their best: they rationalize and improvise, which usually leads to a good, or at least improved, outcome.
  • Security vulnerabilities are distributed among people, processes, paper, procedures, and training. However, the issues around reconciling after closing the polls deserve specific attention.
  • No jurisdiction was perfect, but we did not see a threat to the integrity of an election because of tools or aids for poll workers. The most problematic activity of the day, across jurisdictions, polling places, and precincts, was reconciliation at the end of the day.
  • Current research, including research on instructions for poll workers at NIST, the EAC Quick Start Guides, and the Field Guides To Ensuring Voter Intent, already addresses some of the issues we observed and would also be helpful in improving training, manuals, and Election Day task support.

We also learned that election days can be chaotic, with many stress points, and that planning for security must take this into account. You don’t deploy over a million temporary workers and not get some variation in their diligence and effectiveness.

Here are some of the other highlights from the study:

Security, defined from the point of view of poll workers, goes beyond “chain of custody.” Security in elections comprises the processes, procedures, tools, and people put in place to ensure that elections run freely, fairly, and efficiently.

Attitudes. We saw four broad classes of attitudes among the poll workers we met, ranging from a shallower to a deeper sense of ownership of the polling place. These attitudes were influenced by many factors: personal history, election culture, voting equipment and how long it had been in service, who managed the team, local policies, leadership of the election director or clerk, and changes in laws.

Supporting poll workers the Goldilocks way. Jurisdictions face what we call the “Goldilocks problem” of finding the right balance in how much training and paperwork to give poll workers to support them in their work. The amount of paperwork associated with an election, beyond ballots and tallies, would surprise many people. Starting with poll worker manuals (which can run 200 to 400 pages) given out at training, and ending with reconciliation sheets and equipment and supply inventory sheets, the checklists and documentation of an election can generate reams of paper per poll worker.

Or they might not. Some jurisdictions we studied took a minimalist approach. Poll workers got a 100-page manual, a dozen forms for documenting tallies and incidents, a poll book, and the phone number for the elections office.

Some poll workers had to guess, or make a lot of calls, because manuals were lacking or checklists didn’t exist. Others spent as much time sorting through the paperwork that was supposed to help them as they did doing the work the checklists were meant to support.

Stress points and security vulnerabilities. Election administrators pay attention to the “stress points” in polling places. After each election, they review the feedback from poll workers, clerks, and staff to see where there were problems. It’s a program of constant, incremental improvement. All of the jurisdictions we worked with mentioned similar steps. Every stress point is also a potential security vulnerability. We observed these stress points:

  • Before Election Day
    • Delivering election materials to the polling place
  • Opening the polling place
    • The degree to which work in the polling place is organized or self-organizing
    • Inventorying ballots and other materials
    • Coping with the early start
  • During Election Day
    • Managing traffic flow
    • Documenting and troubleshooting incidents and exceptions
  • Closing the polling place and packing up
    • Inventorying ballots and other materials
    • Reconciling counts from the poll book, ballots, and voting systems
    • Organizing, sorting, and packing up
    • Managing the work assignments and tasks
    • Coping with exhaustion and the urgency to finish the day and post results
  • Delivering the results
    • Checking in with the election office from the polling place
    • Returning materials to the election office
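To make concrete why end-of-day reconciliation is both tedious and error-prone, here is a minimal sketch of the arithmetic poll workers must balance before they can post results. The field names and the three checks are illustrative assumptions, not taken from any jurisdiction’s actual reconciliation forms:

```python
# Illustrative sketch of end-of-day reconciliation arithmetic.
# Field names are invented for this example; real forms vary by jurisdiction.

def reconcile(ballots_received, ballots_unused, ballots_spoiled,
              ballots_cast, poll_book_signatures, machine_count):
    """Return a list of discrepancies found in the end-of-day totals."""
    problems = []

    # Every ballot delivered should be accounted for as unused,
    # spoiled, or cast.
    accounted = ballots_unused + ballots_spoiled + ballots_cast
    if accounted != ballots_received:
        problems.append(
            f"ballot inventory off by {ballots_received - accounted}")

    # The number of voters who signed the poll book should match
    # the number of ballots cast.
    if poll_book_signatures != ballots_cast:
        problems.append(
            f"poll book ({poll_book_signatures}) != ballots cast ({ballots_cast})")

    # The voting system's public counter should also match ballots cast.
    if machine_count != ballots_cast:
        problems.append(
            f"machine count ({machine_count}) != ballots cast ({ballots_cast})")

    return problems


# A day that balances produces an empty list of discrepancies:
print(reconcile(1000, 340, 12, 648, 648, 648))
```

Even in this simplified form, three independent totals must agree; a single miscount, an unlogged spoiled ballot, or a missed poll book signature makes the sheet fail to balance, at the end of a fourteen-hour day.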


We are grateful to the many people who made this project possible. First, thank you to the election officials who hosted our researchers, graciously giving us time and allowing us to see how they prepare for and run polling places on Election Day.

Our team of researchers included user experience professionals from around the country with an interest in civic design. They put in the long hours of an election day and wrote up their field notes for our analysis. Collaborating researchers allowed us to visit more polling places: Emily Barabas, Lynn Baumeister, Rachel Goddard, Kelsey Lim, Karen Lin, Keela Potter, and Josie Scott.

We also worked with University of Minnesota graduate students from Doug Chapin’s class at the Humphrey School: Aaron Rosenthal, Ashley English, Christina Farhart, Hunter Gordon, Julie Koeheler, Paul Linnell, Peter Polga-Hecimovich.

Thank you all.