The 80th Annual AAPOR Conference was an amazing opportunity to learn and connect, as well as share some findings from our recent work. The SSRS team presented over 20 papers and 2 posters on a wide range of topics. We’ve gathered the key takeaways from select sessions to share with those not able to attend the event.
2025 SSRS AAPOR Presentations: Key Takeaways - Day 1 & 2
Explore the findings for each presentation by clicking on the titles going down the left side of the page to expand the details.
Can Visible "Cash" be a QR Code? Presenting the Results of an ABS Experiment
This experiment compared two methods of delivering a pre-incentive in an ABS mailing. The first method is the standard visible cash pre-incentive. The second, which we tested in our experiment, is an innovative approach: instead of including visible cash in the envelope, we printed a QR code with “$5” in the center. We offered $5 to anyone who scanned the QR code, entered an access code, and provided their email address.
- Visible cash outperforms the QR code by generating a higher response rate, higher panel join rate, and faster response speed.
- The QR code approach offers considerable cost savings—even under the assumption that we mail the same number of envelopes (an illustrative calculation follows this list).
- The QR code method proved more effective at reaching younger individuals, non-White participants, lower-income respondents, those living in urban or suburban areas, and individuals more likely to report a party affiliation.
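To make the cost logic concrete, here is a minimal, purely illustrative sketch in Python. The mailing size, cash amount, and QR redemption rate are hypothetical assumptions, not figures from the SSRS experiment; the point is simply that only redeemers are paid under the QR approach, while every envelope carries cash under the visible-cash approach.

```python
# Illustrative only: hypothetical figures, not results from the ABS experiment.
N_ENVELOPES = 10_000          # assume the same number of envelopes in both arms
INCENTIVE = 5.00              # $5 offered in both arms (cash amount is an assumption)
QR_REDEMPTION_RATE = 0.15     # hypothetical share who scan, enter the code, and give an email

# Visible cash: every envelope contains the incentive, regardless of response.
visible_cash_cost = N_ENVELOPES * INCENTIVE

# QR code: only those who actually redeem the code are paid.
qr_cost = N_ENVELOPES * QR_REDEMPTION_RATE * INCENTIVE

print(f"Visible cash incentive cost: ${visible_cash_cost:,.0f}")
print(f"QR-code incentive cost:      ${qr_cost:,.0f}")
print(f"Hypothetical savings:        ${visible_cash_cost - qr_cost:,.0f}")
```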
How (and Why) U.S. Adults Are Using Large Language Models (LLMs)
- The internet age has brought three prior revolutions in how people connect with one another and with information and content – broadband, mobile (smartphones), and social media. Each had profound effects on survey research, moving us from RDD landline methods to dual frames, online surveys, mobile optimization, SMS surveys, and online panels.
- As our recent survey with Elon University’s Center for Imagining the Digital Future confirmed, consumer AI is the fourth revolution. With 52% of U.S. adults using LLM assistants like ChatGPT, Claude, Copilot, and Gemini, including one in three U.S. adults using them daily, it’s imperative for survey researchers to figure out ways to meet respondents where they are (and where they will be) in this new AI-assisted world.
- Two initial shifts to consider are 1) offering specially trained AI assistants in survey platforms to better manage when/how respondents use AI during the survey experience and to capture use as survey metadata, and 2) leaning into AI-enabled interviews for CATI, text and even web to cater to our respondents, many of whom regularly engage in human-like voice conversations with their AI assistants.
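As one hypothetical illustration of the first shift, the sketch below shows what capturing AI-assistant use as survey metadata (paradata) might look like. The field names, structure, and logging approach are assumptions for illustration only, not a description of any SSRS or survey-platform implementation.

```python
# Hypothetical sketch: recording in-survey AI-assistant use as paradata.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AIAssistEvent:
    """One interaction between a respondent and an in-survey AI assistant."""
    question_id: str                  # which item the respondent was answering
    prompt_chars: int                 # length of the respondent's prompt
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

@dataclass
class RespondentSession:
    respondent_id: str
    ai_events: list[AIAssistEvent] = field(default_factory=list)

    def log_ai_use(self, question_id: str, prompt: str) -> None:
        """Record one AI-assistant interaction as survey metadata."""
        self.ai_events.append(AIAssistEvent(question_id=question_id,
                                            prompt_chars=len(prompt)))

# Usage: attach the event log (and a simple flag) to the completed interview record.
session = RespondentSession(respondent_id="R-0001")
session.log_ai_use("Q12_open_end", "Summarize my views on broadband access")
used_ai_assistant = bool(session.ai_events)   # analytic flag stored with the case
```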
Hey, Heads Up: Examining The Impact of Using SMS Pre-notifications to Improve Response Rates on Probability-panel Surveys
- While not every survey is suitable for SMS or even SMS push-to-web invitations, we can still use SMS as a powerful tool to reach panelists where they want to be reached. One of the pros of working with a probability panel like the SSRS Opinion Panel is that we can communicate with panelists via SMS outside of standard invites/reminders at little to no additional cost.
- The SMS Pre-Notification is a tool that notifies panelists of an upcoming survey. This way they are primed/expecting a survey invitation rather than receiving one out of the blue.
- While the SMS Pre-Notification does not increase the overall response rate for a given study, it does increase the response rate for harder-to-reach populations and demographics such as Black panelists and parents. Moreover, the pre-notification encourages response from panelists with different attitudes and behavioral characteristics within the survey.
Can Visible "Cash" be a QR Code? Presenting the Results of an ABS Experiment
The first method is the standard visible cash pre-incentive. The second, which we tested in our experiment, is an innovative approach: instead of including visible cash in the envelope, we printed a QR code with “$5” in the center. We offered $5 to anyone who scanned the QR code, entered an access code, and provided their email address.
- Visible cash outperforms the QR code by generating a higher response rate, higher panel join rate, and faster response speed.
- The QR code approach offers considerable cost savings—even under the assumption that we mail the same number of envelopes.
- The QR code method proved more effective at reaching younger individuals, non-White participants, lower-income respondents, those living in urban or suburban areas, and individuals more likely to report a party affiliation.
How (and Why) U.S. Adults Are Using Large Language Models (LLMs)
- The internet age has brought three prior revolutions in how people connect with one another and with information and content – broadband, mobile (smartphones), and social media. Each had profound effects on survey research, moving us from RDD landline methods to dual frames, online surveys, mobile optimization, SMS surveys, and online panels.
- As our recent survey with Elon university’s Center for Imagining the Digital Future confirmed, consumer AI is the fourth revolution. With 52% of U.S. adults using LLM assistants like ChatGpt, Claude, Copilot and Gemini, including one in three U.S. adults using them daily, it’s imperative for survey researchers to figure out ways to meet respondents where they are (and where they will be) in this new AI-assisted world.
- Two initial shifts to consider are 1) offering specially trained AI assistants in survey platforms to better manage when/how respondents use AI during the survey experience and to capture use as survey metadata, and 2) leaning into AI-enabled interviews for CATI, text and even web to cater to our respondents, many of whom regularly engage in human-like voice conversations with their AI assistants.
Hey, Heads Up: Examining The Impact of Using SMS Pre-notifications to Improve Response Rates on Probability-panel Surveys
- While not every survey is suitable for SMS or even SMS push-to-web invitations, we can still use SMS as a powerful tool to reach panelists where they want to be reached. One of the pros of working with a probability panel like the SSRS Opinion Panel is that we can communicate with panelists vis SMS outside of standard invites/reminders at little to no additional costs.
- The SMS Pre-Notification is a tool that notifies panelists of an upcoming survey. This way they are primed/expecting a survey invitation rather than receiving one out of the blue.
- While the SMS Pre-Notification does not increase the overall response rate for a given study, it does increase the response rate for harder to reach populations and demographics such as Black panelists and Parents. Moreover, the pre-notification encourages response from panelists with different attitudes and behavioral characteristics within the survey.
Key Takeaways: Day 3 - Part 1
Explore the findings for each presentation by clicking on the titles going down the left side of the page to expand the details.
- Beyond "Liberal”, “Conservative" and “Moderate” - Improving Self-Identification Questions to Capture a Nuanced Ideology
- Refining Insights: Leveraging Iterative Qualitative Research to Explore Fears and Uncertainties for People with Parkinson’s Disease
- Deciders in Focus: A Novel Approach to Gathering Real-Time Presidential Debate Reactions
- The Importance of Socio-Economic and Geographic Dimensions in International Non-Probability Sample Calibration
- Recontacting Registered Voters Over Time: How Does It Impact Pre-Election Projections?
Beyond "Liberal”, “Conservative" and “Moderate” - Improving Self-Identification Questions to Capture a Nuanced Ideology
- Self-identified political ideology is usually measured on a linear Liberal/Moderate/Conservative scale. However, we worry the question is too restrictive because some respondents with less common or more nuanced political beliefs may not identify with any of those terms or may identify with more than one of them.
- SSRS tested an alternative version of the question which allowed respondents to pick as many options as they wanted from a longer list of ideologies (including niche options like populist, progressive, socialist, etc.).
- Through quantitative and qualitative testing, we ultimately found this new question posed the risk of too much measurement error because many respondents didn’t have a firm grasp on the definitions of the less common ideologies.
- Although the traditional Liberal/Moderate/Conservative scale is restrictive, our analysis found that respondents understood the scale well and knew where to place themselves even if they didn’t truly identify with the term (e.g., “progressive” or “socialist” respondents will still pick “liberal” if it’s their only option).
Refining Insights: Leveraging Iterative Qualitative Research to Explore Fears and Uncertainties for People with Parkinson’s Disease
- The flexibility of qualitative methods allows researchers to learn from the initial stages of their study and modify the research protocol to maximize insights.
- Being poised to pivot amidst data collection is particularly valuable when conducting research with limited budgets and low-incidence populations.
- When data saturation is reached, an iterative process of a multi-phased research design can be effective and efficient in deepening understanding of complex concepts and informing the development of a novel survey instrument.
Deciders in Focus: A Novel Approach to Gathering Real-Time Presidential Debate Reactions
- Darby Steiger presented this paper, which was co-authored by Kristen Conrad and Dave Rodbart, as well as Scott Clement and Emily Guskin from The Washington Post, about our recent real-time event board that we conducted during the 2024 Harris-Trump Presidential Debate.
- SSRS designed this novel methodology to allow us to collect qualitative feedback in real time from a group of undecided voters, providing The Washington Post with instant access to the data so that they could publish results immediately throughout the debate.
- This methodology offers many opportunities for other applications, such as with the State of the Union address, sporting events, awards shows, and other types of events.
The Importance of Socio-Economic and Geographic Dimensions in International Non-Probability Sample Calibration
Under the umbrella of opt-in (convenience) samples:
- Inclusion of Geographic Region with Sex and Age does little in terms of reducing bias across countries
- Education has a greater impact in reducing bias in conjunction with Sex by Age (see the weighting sketch after this list)
- If demographically representative targets are unachievable in certain countries, then post-calibration is a valuable tool in reducing non-response bias and producing a more demographically representative sample
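The following is a minimal raking (iterative proportional fitting) sketch in Python showing how calibration dimensions such as a Sex-by-Age cross and an Education margin can be layered for an opt-in sample. The sample data, target proportions, and convergence settings are hypothetical and purely illustrative; this is not the calibration procedure used in the presented study.

```python
import pandas as pd

def rake(df, margins, weight_col="weight", max_iter=100, tol=1e-8):
    """Adjust weights so weighted proportions match the targets for each margin.
    margins maps a tuple of column names to {level-tuple: target proportion}."""
    df = df.copy()
    for _ in range(max_iter):
        biggest_adjust = 0.0
        for cols, targets in margins.items():
            # Row-wise key for this margin, e.g. ("F", "18-39") for Sex by Age.
            keys = list(zip(*(df[c] for c in cols)))
            for level, target in targets.items():
                mask = pd.Series([k == level for k in keys], index=df.index)
                current = df.loc[mask, weight_col].sum() / df[weight_col].sum()
                if current > 0:
                    factor = target / current
                    df.loc[mask, weight_col] *= factor
                    biggest_adjust = max(biggest_adjust, abs(factor - 1.0))
        if biggest_adjust < tol:
            break
    return df

# Hypothetical opt-in sample and hypothetical population targets.
sample = pd.DataFrame({
    "sex":  ["F", "M", "F", "M", "F", "M"],
    "age":  ["18-39", "18-39", "40+", "40+", "40+", "18-39"],
    "educ": ["HS", "BA+", "HS", "HS", "BA+", "BA+"],
    "weight": 1.0,
})
margins = {
    ("sex", "age"): {("F", "18-39"): 0.22, ("M", "18-39"): 0.22,
                     ("F", "40+"): 0.29, ("M", "40+"): 0.27},
    ("educ",):      {("HS",): 0.60, ("BA+",): 0.40},   # Education margin layered on top
}
weighted = rake(sample, margins)
print(weighted.groupby(["sex", "age"])["weight"].sum() / weighted["weight"].sum())
```

A Geographic Region margin could be added the same way as another entry in `margins`; the finding above suggests it contributes less bias reduction than Education once Sex by Age is already controlled.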
Recontacting Registered Voters Over Time: How Does It Impact Pre-Election Projections?
Overview: This presentation used a methodological sample design in which the second wave of a study combined Fresh RBS (Registration-Based Sample) with recontact sample (i.e., respondents from first-wave RBS interviews who agreed to be recontacted) in seven swing states to survey registered voters, seeking to determine the effectiveness of recontacts over time and how different propensity levels across voter groups may impact overall results.
The study was conducted in three waves for The Washington Post: Wave 1 (April-May 2024) used all Fresh RBS sample; Wave 2 (September-October 2024) used a combination of Fresh RBS and Recontact sample; and Wave 3 (December 2024) used all Recontact sample. Analysis focused on Waves 1 and 2.
Key Takeaways:
- When looking at change in vote choice from Wave 1 to Wave 2 (i.e., respondents who switched who they said they were going to vote for), both the Fresh RBS sample and the Recontact sample showed similar patterns. Additionally, looking at Wave 2 only, the Trump vote share (i.e., percentage of people saying they would vote for Trump) was comparable for the Fresh RBS/Recontact (this study design) sample versus the Fresh RBS alone (had the Recontacts not been part of the sample). Since the two samples show minimal difference in outcome, using recontact sample can provide a potential means for both tracking respondents over time and saving cost.
- The Recontact sample showed similar patterns from Wave 1 to Wave 2 in recontact rates by the respondent’s vote preference (i.e., results show a clear pattern of stronger Trump support across all registered voters correlating with lower response rates). Thus, non-ignorable nonresponse exists in both the Fresh RBS and Recontact samples. Harder-to-reach/lower-propensity groups should be pursued regardless of sampling option.
- Some results (such as state-level analysis versus analysis across states) show more volatility, but this may be due to smaller sample sizes. Be mindful of sample sizes and the contribution of each component if blending Recontact with Fresh RBS sample, especially at the state level.
Beyond "Liberal”, “Conservative" and “Moderate” - Improving Self-Identification Questions to Capture a Nuanced Ideology
- Self-identified political ideology is usually measured on a linear Liberal/Moderate/Conservative scale. However, we worry the question is too restrictive because some respondents with less common or more nuanced political beliefs may not identify with any of those terms or may identify with more than one of them.
- SSRS tested an alternative version of the question which allowed respondents to pick as many options as they wanted from a longer list of ideologies (including niche options like populist, progressive, socialist, etc.).
- Through quantitative and qualitative testing, we ultimately found this new question posed the risk of too much measurement error because many respondents didn’t have a firm grasp on the definitions of the less common ideologies.
- Although the traditional Liberal/Moderate/Conservative scale is restrictive, our analysis found that respondent understood the scale well and knew where to place themselves even if they didn’t truly identify with the term (e.g., “progressive” or “socialist” respondents will still pick “liberal” if its their only option).
Refining Insights: Leveraging Iterative Qualitative Research to Explore Fears and Uncertainties for People with Parkinson’s Disease
- The flexibility of qualitative methods allows researchers to learn from the initial stages of their study and modify the research protocol to maximize insights.
- Being poised to pivot, amidst data collection, is particularly valuable when conducting research with limited budgets and low incidence populations
- When data saturation is reached, an iterative process of a multi-phased research design can be effective and efficient in deepening understanding of complex concepts and informing the development of a novel survey instrument.
Deciders in Focus: A Novel Approach to Gathering Real-Time Presidential Debate Reactions
- Darby Steiger presented this paper, which was co-authored by Kristen Conrad and Dave Rodbart, as well as Scott Clement and Emily Guskin from The Washington Post, about our recent real-time event board that we conducted during the 2024 Harris-Trump Presidential Debate.
- SSRS designed this novel methodology to allow us to collect qualitative feedback in real time from a group of undecided voters, providing The Washington Post with instant access to the data so that they could publish results immediately throughout the debate.
- This methodology offers many opportunities for other applications, such as with the State of the Union address, sporting events, awards shows, and other types of events.
The Importance of Socio Economic and Geographic Dimensions in International Non-Probability Sample Calibration
In the umbrella of opt-in (convenience samples):
- Inclusion of Geographic Region with Sex and Age does little in terms of reducing bias across countries
- Education has a greater impact in reducing bias in conjunction with Sex by Age
- If demographically representative targets are unachievable in certain countries, then post-calibration is a valuable tool in reducing non-response bias and producing a more demographically representative sample
Recontacting Registered Voters Over Time: How Does It Impact Pre-Election Projections?
Overview: Using a methodological sample design where the second wave of a study combined Fresh RBS (Registration-Based Sample) with recontact sample (i.e., RBS sample completed interviews from a first wave with respondents who agreed to be recontacted) in seven swing states to survey registered voters, this presentation sought to help determine the effectiveness of recontacts over time and how different propensity levels across voter groups may impact overall results.
The study was conducted in 3 waves for The Washington Post: Wave 1 – April-May, 2024 with all Fresh RBS sample; Wave 2 – September-October, 2024, with a combination of Fresh RBS and Recontact Sample, and Wave 3 – December, 2024, with all Recontact Sample. Analysis focused on Waves 1 and 2.
Key Takeaways:
- When looking at change in vote choice from Wave 1 and 2 (i.e., switched who they said they were going to vote for from Wave 1 to Wave 2), both the Fresh RBS sample and the Recontact sample showed similar patterns. Additionally, looking at Wave 2 only, the Trump vote share (i.e., percentage of people saying they would vote for Trump) was comparable for the Fresh RBS/Recontact (this study design) sample versus the Fresh RBS alone (had the Recontacts not been part of the sample). Since the two samples show minimal difference in outcome, using recontact sample can provide a potential means for both tracking respondents over time and saving cost.
- The Recontact sample showed similar patterns from Wave 1 to Wave 2 in recontact rates by the respondent’s vote preference (i.e., Results show a clear pattern of stronger Trump support across all Registered Voters correlating with lower response rates). Thus, non-ignorable response exists in both Fresh RBS and Recontact sample. Harder to reach/lower propensity groups should be pursued regardless of sampling option.
- Some results (such as state level analysis vs across states) show more volatility, but this may be due to smaller sample sizes. Be mindful of sample sizes and the contribution of each component if blending Recontact with Fresh RBS sample, especially at the state level.
Key Takeaways: Day 3 - Part 2
Explore the findings for each presentation by clicking on the titles going down the left side of the page to expand the details.
- Effects of a Pre- and Post-Incentive Experiment on Response Rates in a Statewide ABS Household Enumeration Survey and Subsequent Enrollment in the Full Study
- Shedding Some Light on Panel Conditioning: Trends in Response Behavior and Data Quality in a National Probability Panel
- Assessing the Performance of Probabilistic Likely Voter Models in the 2024 Election
- The Effectiveness of Custom Calibration in Combining Mixed-Probability Samples
Effects of a Pre- and Post-Incentive Experiment on Response Rates in a Statewide ABS Household Enumeration Survey and Subsequent Enrollment in the Full Study
- Response rates for this household enumeration study improved after introducing a visible non-contingent cash incentive in the initial mailing; adding a pre-incentive was more effective in boosting response than offering a $5 or $10 contingent post-incentive alone. This is in line with prior research, and is interesting in that it demonstrates pre-incentives are effective at encouraging responses even when the survey in question asks for sensitive information (demographic and contact information) about members of the respondent’s household.
- Adding a pre-incentive was associated with boosted response rates AND with key demographics (race/ethnicity) of responding households being closer to population parameters than non-responders.
- While adding a pre-incentive was associated with certain key demographics of responding households being closer to population parameters (more white households responding), it was also associated with certain oversampled, hard-to-reach groups (i.e., minority households) being less likely to respond than when a pre-incentive was not included.
Shedding Some Light on Panel Conditioning: Trends in Response Behavior and Data Quality in a National Probability Panel
- We looked for evidence of panel conditioning—defined as changes in behavior or attitudes due to repeated exposure to surveys—among members of the SSRS Opinion Panel, a nationally representative probability panel of U.S. adults.
- We saw some evidence of conditioning with respect to survey-taking behavior itself, but these effects were largely benign or positive—for example, panelists who take more surveys become slightly less likely to use mobile devices, and slightly more satisfied with their experience on the Panel.
- We did not see strong evidence of conditioning effects with respect to substantive outcomes such as partisanship, ideology, or voter registration status, though more research is needed with a wider range of outcomes.
Assessing the Performance of Probabilistic Likely Voter Models in the 2024 Election
- Probabilistic likely voter models, trained on the voter file, predicted turnout in the 2024 election with a high degree of accuracy.
- In states with significant polling error, the error seems to have been driven primarily by nonresponse—specifically, lower response rates among voters modeled to have a moderate probability of supporting Trump—rather than likely voter modeling.
- In election polls that use registration-based samples, it may be possible to reduce polling error by calibrating to voter-file-based benchmarks for the interaction between modeled turnout probability and modeled vote preference.
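As a hypothetical illustration of that last point, the sketch below builds voter-file-based benchmark shares for the cross of modeled turnout probability (binned) and modeled vote preference, which a registration-based-sample poll could then use as raking targets. The column names, bin cut points, and simulated data are assumptions for illustration, not the SSRS specification.

```python
import numpy as np
import pandas as pd

def interaction_targets(voter_file: pd.DataFrame) -> dict:
    """Return {(turnout_bin, modeled_pref): share of registered voters} from a voter file."""
    bins = [0.0, 0.25, 0.5, 0.75, 1.0]                       # hypothetical cut points
    labels = ["low", "mid-low", "mid-high", "high"]
    vf = voter_file.copy()
    vf["turnout_bin"] = pd.cut(vf["turnout_score"], bins=bins,
                               labels=labels, include_lowest=True)
    shares = (vf.groupby(["turnout_bin", "modeled_pref"], observed=True)
                .size() / len(vf))
    return shares.to_dict()

# Purely simulated voter file with modeled scores (illustrative column names).
rng = np.random.default_rng(0)
voter_file = pd.DataFrame({
    "turnout_score": rng.uniform(0, 1, 1_000),
    "modeled_pref": rng.choice(["Rep", "Dem", "Other/Unknown"], size=1_000),
})
targets = interaction_targets(voter_file)
# `targets` could then serve as raking margins alongside demographics, so that,
# for example, modeled mid-probability Trump supporters are not underrepresented.
```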
The Effectiveness of Custom Calibration in Combining Mixed-Probability Samples
- Thoughtful weighting procedures that consider factors beyond demographics are needed when combining probability- and non-probability-based samples
- Adjustments using Encipher, based on non-demographic calibration, can provide stable estimates for small populations
- Estimates are similar to other probability-only sources

POSTER: Understanding Uninsured Respondents: Insights from State Health Surveys in Oregon, Minnesota, and Massachusetts
- The incidence of being uninsured is higher on a prepaid cell phone frame than on an address-based sample (ABS) frame.
- Within the ABS frame, the incidence of being uninsured is higher among those using the inbound telephone mode than among those completing on the web.
- Although demographic trends in uninsurance rates are fairly consistent across states, there are important differences in magnitude that are worthy of attention when planning sample designs.

POSTER: Penny for your thoughts? Incentive Structure and Unexpected Changes to ABS Yield during the Oregon Health Insurance Survey
- Between the 2023 and 2024 cycles of OHIS, SSRS had to switch from a $1 visible cash pre-incentive to a $10 conditional post-incentive. Despite this change, which we anticipated would hurt survey yield in 2024, we actually saw a dramatic improvement in the amount of completes yielded per piece of sample.
- While there was no experimental data to analyze, a process analysis showed that other than the change in incentive structure, there were no other major changes made to the survey processes or protocol that could account for this change.
- A descriptive analysis of the demographic and geographic data showed that the improved responsiveness did not alter the distribution of important demographic or regional traits in the data.
- While we can’t say for certain that the incentive changes were involved in the yield change, we suspect they are related and call for replication in a controlled experiment, potentially paying attention to survey region if done outside of Oregon.