Testing Medicaid AVR with data consent

In Colorado, fewer than five percent of Medicaid applicants who saw an explicit voter registration question ("Do you want to register to vote?") opted in. We worked with Colorado to improve AVR rates.

Instead of asking people whether they want to register, which can be a risky question to answer for people who aren't eligible to register, we took a different approach. We tested whether a consent model, in which people are asked whether they consent to having their data shared with their election office for voter registration purposes, would increase opt-in rates.

We found that when we switched to a consent model, half of the participants answered "yes" to the consent question, a huge improvement over the original opt-in rate.

Center for Civic Design Research Report - Updates from the front line of civic design research

Colorado Medicaid AVR with data consent testing report


Key findings

Half of the participants answered "yes" to the consent question

Participants answered "yes" to the consent question at a much higher rate than Medicaid applicants answer "yes" to the current explicit voter registration question. This indicates that the consent model is a more effective way of satisfying Section 7 of the National Voter Registration Act (NVRA).

Participants liked plain, specific, and precise language

We tested four different versions of the consent question. Each one had a different tone:

  • Inspirational
  • Impersonal
  • Informational
  • Invisible

We learned that participants consistently preferred plain language, though they sometimes wanted more detail.

Where we put the consent question didn’t affect opt-in rates

Participants cited good reasons for both positions (the beginning and the end of the application):

“It’s better to be upfront about this”
“This would sum up the application better, at the end”

About this research

This research was conducted for the Institute for Responsive Government by Sean Johnson, Isabelle Yisak, and Evie Lacroix.

We held usability testing sessions with 24 participants in Denver and Aurora.

How we recruited:

  • We recruited both online (Craigslist) and in-person (intercepts).
  • We chose sites in ZIP codes with mostly low- to middle-income households.
  • All participants had applied for Medicaid before.

We tested a data consent model for AVR on the PEAK multibenefit application.

  • We created realistic mockups of the current online application and had participants fill out the application on a laptop while we observed and took notes.

The main questions that drove our research were:

  • Are Medicaid applicants more likely to answer “yes” to a consent question than they are to an explicit voter registration question?
  • What’s the best wording for the consent question?
  • Where should the consent question go in the application?

Related resources

Visit our page on voter registration to learn best practices for different registration policies, including automatic voter registration, same-day voter registration, and online voter registration.