When King County Elections (KCE) reached out to us asking for support with usability testing in 7 languages, our first thought was: 7? That’s a lot of languages.
We’ve done multilingual usability testing before and have written a guide on how to conduct it. But we’d never run a study with this many languages at once.
It was a new and exciting challenge. We planned for the test throughout 2024. Then, over a 2-week period in January 2025, we coached KCE through usability testing and synthesis.
Along the way, we learned that testing in this many languages isn’t just possible, it’s entirely manageable! It can also lead to valuable insights that make multilingual voting materials clearer and more effective.
We’re sharing this case study to give a behind-the-scenes look at the process. Whether you’re thinking about adding 1 more language, 5 more, or just getting started, we hope it helps you design and run your own multilingual testing.
Voters in Seattle will vote in their first ranked choice voting (RCV) primary in August 2027. Preparing for the change in voting method meant testing materials and making sure voters could understand and trust them.
For KCE, preparing voters for their first RCV election wasn’t just about explaining a new system. It was also about doing so in all these languages and capturing the cultural context and nuance of each. That meant ensuring the voting material design was clear, trusted, and culturally appropriate in the most common languages their voters speak: Chinese, Korean, Russian, Somali, Spanish, Vietnamese, and English.
We worked with KCE to co-design a multilingual usability test: designing the study, finding and training the moderators, and then working together on the research synthesis. Overall, this project had 3 phases: planning and preparation, independent in-language testing sessions, and cross-language synthesis.
The planning and preparation phase was essential to making sure the whole project ran as smoothly as it did. We worked with KCE to co-design the study and create guidelines for research, moderation, and usability testing.
Preparation included online training sessions with all 7 moderators and the members of KCE’s Language Access & Community Outreach team. The training made sure the moderators were prepared to dive into the usability tests: it included helping them translate and adapt all testing scripts, ballots, and election materials for each language, and ensuring the layout and design fit each script. We also created a moderator guide.
During training, we ran a mock usability session, practiced note-taking, and walked through the study’s objectives, so that all the moderators, whether or not they had prior usability testing experience, started on the same page.
KCE recruited participants in each language and provided a $50 gift card for their time.
The first part of testing involved independent sessions. Each moderator tested with at least 8 participants from their community.
Participants reviewed the ballot package, practiced voting in a mock RCV contest, and shared their thinking aloud.
Moderators documented key observations, sorted similar insights, and created “heat maps” of challenging areas in the materials.
During this period, we held office hours on Zoom to give the team ongoing support and answer any questions that came up during testing and while moderators wrote their in-language debriefs.
Image: Moderators, CCD, and KCE staff came together in person for cross-language synthesis
The next step was cross-language synthesis. Moderators, CCD, and KCE staff came together for an all-day in-person workshop to combine findings across all 7 languages.
The workshop was divided into 2 parts.
Mapping themes: Moderators collected the themes they identified in each language and posted them on the wall. Then we grouped these themes into shared challenges and unique issues.
Heat map consolidation: The group merged language-specific heat maps into a single document, sparking discussions on design solutions that would work for everyone while respecting language differences.
A hard part of multilingual testing is making sure that problems found in 1 language don’t get lost. Every language has its own culture and nuance, so we wanted to be sure those details showed up in our final takeaways. Keeping track of these issues after the testing ended, and throughout the debrief and cross-language synthesis, was key to making sure nothing slipped through the cracks.
In our cross-language conversations, everyone spoke from their own language background and experience. That made it easier to identify which issues were truly language-specific and which applied across languages. Keeping the two kinds of issues separate helped ensure nothing important was dropped during synthesis.
Another challenge was not losing detail in the synthesis process. This was a large, complex project with many people involved in the research synthesis, and each team member had spoken to a lot of participants. We wanted to make sure our larger findings didn’t overshadow smaller but still critical ones.
When working in 7 languages, even if we spoke to 10 people in each, a single participant’s significant issue could still be meaningful. We had to design the process carefully so those individual but important details weren’t smoothed over as we moved toward broader insights. The “brain dump” step, where moderators recorded every observation before synthesis began, helped make sure nothing was overlooked.
Translation discussions added another layer of complexity. Conversations about translation in just 1 language can already be nuanced, but bringing in multiple languages can lead to surprising insights. We saw patterns emerge where a translation issue was unique to a single language, as well as instances where translation challenges were shared across multiple languages.
KCE now has a set of RCV materials that reflect the diverse needs of its community and a testing process for building trust through intentional, multilingual design. The partnership strengthened their voter education strategy and set a new bar for multilingual testing in voting materials.
Part of what made this collaboration successful was our ongoing partnership with KCE, which made it easier to cover new ground more quickly. In addition, KCE already had experience with usability testing and a robust language access team.
Based on this project, here are a few big takeaways:
Testing in multiple languages can surface many insights that improve your materials. For offices with multiple language requirements, we’ve laid out the process so that it’s easy to follow and answers common questions along the way.
Tracking changes in each language and then across languages helped identify trends and open up nuanced conversations about meaning in each language.
No finding is too small or insignificant during the “brain dump” phase. Capture findings before you forget them; you’ll discover how ideas and findings intersect across languages later, during cross-language synthesis. Sometimes the small details lead to unexpected discoveries.
We’ve seen how effective usability testing can be for election materials, even for offices new to testing. Whether you’re a usability testing expert or are running a test for the first time, we’re excited to support you along the way!