Background
SSRS conducted a novel qualitative study for the Washington Post-Schar School 2024 Deciders Study, gathering real-time qualitative feedback during the only 2024 presidential debate held between Donald Trump and Kamala Harris. The goal was twofold: to capture immediate reactions to the candidates’ statements throughout the debate from a large and diverse group of potential voters who had not yet solidified their vote preference, and to enable the Washington Post to publish the results instantly.
The Challenge
In designing the methodological approach, SSRS found that the traditional qualitative toolkit could not meet the needs of the study, chief among them the client’s request for nearly real-time publication of findings.
A live focus group would have had several disadvantages, including:
- A small group size
- The innate disruption of having to interrupt the debate to ask participants to share their reactions
- The inability to collect reactions from every participant to every probe, and
- The challenges of providing the client with instant access to the data.
A dial test would have captured reactions in a numerical format (for example, moment-by-moment reactions on a 100-point scale), but the reasons behind those reactions could only have been gathered in a follow-up focus group, by which point participants may have forgotten why they felt the way they did in the moment.
An asynchronous discussion board would have required pre-set questions, leaving SSRS unable to probe into what was actually being discussed during the debate.
The Solution
The SSRS solution was a novel methodology that enabled larger-scale qualitative insights in real time. Using Sago’s Crowd Survey platform, SSRS posed tailored questions created in the moment, gathering simultaneous feedback from dozens of prescreened potential voters and viewing and downloading responses as they came in.
In practice, this meant that as soon as the candidates finished discussing a particular issue, SSRS could launch a question on the platform asking which candidate made the better arguments on that issue and why participants chose their answer. Similarly, SSRS could ask for reactions to a specific statement made by a candidate: whether the statement made sense (and why), or whether participants agreed or disagreed with it (and why).
Participants remained logged into the platform throughout the debate, and each time a question was posed, SSRS typically received a 100% response rate within about eight minutes. Data were available to the Washington Post team for immediate download, with responses displayed alongside participants’ demographic characteristics.
Because this was a new methodology, one of the biggest challenges was ensuring that participants understood how to log onto the platform and what would happen throughout the evening. To address this, SSRS designed a communication “campaign” in the days leading up to the debate, sending daily emails and texts that gave participants basic tips about what to expect. When participants arrived on the platform a few minutes before the debate began, this information was repeated on the platform’s landing page. This active communication helped build rapport between the research team and the participants and contributed to a 100% show rate and a 96% completion rate.