Designing an accessible RCV ballot

We used the Anywhere Ballot interface as the basis for research into how to support voters who are blind or have low vision, have limited or no use of their hands, or have cognitive or attention disabilities. Some of the best practices we discovered in this research include giving voters control of all interactions, creating a structure for efficient listening, and presenting candidates in ranked order on the review screen.


Key findings

Some of the best practices we learned include:

  • Give voters control of all interactions. Any sense that the system was making decisions for them made participants trust it less. In the prototype, voters select candidates in the order they want to rank them. At any point, they can put the list of candidates into ranked order and edit the rankings (the first sketch after this list models that interaction). This simple interaction helped voters discover how ranking works and encouraged many to rank more candidates. Make controls visible, both for accessibility and to show what actions are possible.
  • Create a structure for efficient listening. Consistency and clear cues gave participants the confidence to move quickly through the ballot. Use a consistent syntax of “To [do something], press [key name]”, keeping instructions in the same order and using pauses to separate chunks of information (the second sketch after this list illustrates the pattern).
  • Present candidates in ranked order on the review screen. Include a prompt for the number of unranked candidates.
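
Here is a minimal sketch in TypeScript of the ranking interaction and review summary described above. All of the names (RankedContest, rank, unrank, move, reviewSummary) are illustrative assumptions for this note; they are not taken from the Anywhere Ballot or ElectionGuard code.

    // Hypothetical in-memory model of the ranking interaction: voters pick
    // candidates in the order they want to rank them, and can edit the
    // ranking at any point. Names are illustrative, not from the real code.
    class RankedContest {
      private ranking: string[] = []; // candidates in the order the voter picked them

      constructor(private candidates: string[], private maxRanks: number) {}

      // Select a candidate for the next available rank.
      rank(candidate: string): void {
        if (this.ranking.includes(candidate)) return; // already ranked
        if (this.ranking.length >= this.maxRanks) {
          throw new Error("All ranks are used; remove a ranking first.");
        }
        this.ranking.push(candidate);
      }

      // The voter stays in control: any ranking can be removed...
      unrank(candidate: string): void {
        this.ranking = this.ranking.filter((c) => c !== candidate);
      }

      // ...or moved to a new position at any point.
      move(candidate: string, newIndex: number): void {
        this.unrank(candidate);
        this.ranking.splice(newIndex, 0, candidate);
      }

      // Review screen: candidates in ranked order, plus a prompt for the
      // number of unranked candidates.
      reviewSummary(): string[] {
        const lines = this.ranking.map((c, i) => `Rank ${i + 1}: ${c}`);
        const unranked = this.candidates.length - this.ranking.length;
        if (unranked > 0) {
          lines.push(`${unranked} candidates are not ranked.`);
        }
        return lines;
      }
    }

    // Example: a voter ranks two of four candidates, then edits the order.
    const contest = new RankedContest(["Ada", "Ben", "Cho", "Dee"], 3);
    contest.rank("Cho");
    contest.rank("Ada");
    contest.move("Ada", 0); // the voter changes their mind at any point
    console.log(contest.reviewSummary());
    // ["Rank 1: Ada", "Rank 2: Cho", "2 candidates are not ranked."]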
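
The second sketch, also in TypeScript, shows the instruction pattern for efficient listening: the same “To [do something], press [key name]” syntax for every instruction, with pauses separating the chunks. The PAUSE marker and function names are assumptions for this illustration, not part of any real audio API.

    // Illustrative builder for audio instructions. A consistent syntax and
    // pauses between chunks let voters listen efficiently and skip ahead
    // once they recognize the pattern.
    const PAUSE = "<break/>"; // stands in for a short silence in the audio output

    function instruction(action: string, keyName: string): string {
      return `To ${action}, press ${keyName}.`;
    }

    function audioChunks(instructions: Array<[string, string]>): string {
      // Keep instructions in the same order every time and separate them
      // with pauses so each chunk can be heard (or skipped) on its own.
      return instructions
        .map(([action, key]) => instruction(action, key))
        .join(` ${PAUSE} `);
    }

    console.log(
      audioChunks([
        ["move to the next candidate", "the down arrow"],
        ["rank this candidate", "the select button"],
        ["hear your rankings", "the right arrow"],
      ])
    );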

Putting the research to work

Based on this research, we created a best-practice guide to accessible ranked choice voting ballots.

About the research

The 15 research participants included voters with no use of their hands, voters with autism or other attention and cognitive disabilities, and 6 blind voters who used an interactive prototype. We started from the ballot interface in the ElectionGuard GitHub repository, adding a new contest type and a review-screen display. We used a static mockup for the printed ballot.

The work on the audio format was challenging because we wanted to experiment fluidly with different phrasing, even trying alternatives during a session. To do this, we borrowed a method from the Los Angeles County VSAP research team, with a human serving as the voice of the voting system. The participant listened to the audio and simply spoke the name of the button they would press on the keypad. One of the researchers “drove” the interface (following the participant's instructions) so the moderator and observer could follow the interaction. The participant could interrupt with an instruction, just as if they were pressing a button. It worked so well that one participant did not realize the audio was not digital.

[Video: a clip from a testing session showing how the audio ballot works]

This research was conducted by Lynn Baumeister, Alex Haraseyko, and Whitney Quesenbery.

Related resources

Visit our page on ranked choice voting to find more resources for designing ballots, voter education, and election results for ranked choice elections.