The Current Innovations in Probability-based Household Internet Panel Research (CIPHER) Conference brought together researchers and policymakers from the United States and beyond for a wide-ranging conversation about innovations, challenges, and opportunities in this field. SSRS team members Mickey Jackson, Darby Steiger, Kyle Berta, and Cameron McPhee participated in the event. Below are the key concepts from their discussions.

Building a Real-time Attrition Risk Score for Probability Panelists

Presented by Mickey Jackson, SSRS VP of Data Science & Innovation

KEY CONCEPTS

SSRS has been exploring ways to model the risk of attrition (dropping out of the Panel by no longer responding to surveys) for Opinion Panel members. More accurate attrition models would allow us to better target operational interventions to reactivate panelists who have stopped responding to surveys. We compared multivariate random forest models, which account for a wide variety of panelist- and panelist-by-study-level predictors, to a rule-based method that simply counts the number of consecutive nonresponses for a panelist; a simplified sketch of both approaches appears after the list below.

  • After a panelist has been invited to roughly 10 surveys, multivariate models do not outperform the simpler rule-based method at predicting whether the panelist will respond to any future surveys.
  • However, earlier in the panelist’s tenure, the multivariate models can help us identify panelists who may be at heightened risk of attrition.
  • One key predictor of attrition is the number of prior surveys for which the panelist never clicked the link to access the survey. This type of nonresponse is by far the strongest predictor of whether the panelist will fail to respond to future surveys. In comparison, breakoffs (when a panelist enters the survey but then leaves before completing it) are much less predictive of future nonresponse.
  • Another key predictor is the frequency with which panelists are sampled for surveys early in their tenure on the Panel. The more frequently a panelist is invited to surveys after joining the Panel, the more likely they are to continue responding in the future. This is consistent with other SSRS findings on the relationship between survey burden and Panel attrition. Read more about our work on this topic >>
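
To make the comparison concrete, below is a minimal sketch of the two approaches in Python with scikit-learn. This is not SSRS’s production scoring code; the feature names, threshold, and toy data are hypothetical, chosen only to mirror the predictors discussed above.

  # Minimal sketch: rule-based consecutive-nonresponse flag vs. a
  # multivariate random forest. Not SSRS's production code; feature
  # names and data are hypothetical.
  import numpy as np
  from sklearn.ensemble import RandomForestClassifier

  def rule_based_at_risk(response_history, threshold=3):
      """Flag a panelist once their most recent `threshold` invitations
      were all nonresponses (history is 0/1 per invitation, newest last)."""
      recent = response_history[-threshold:]
      return len(recent) == threshold and not any(recent)

  # Multivariate alternative: a random forest over panelist- and
  # panelist-by-study-level predictors, mirroring those found most
  # predictive in the study (never-clicked invitations, breakoffs,
  # early invitation frequency).
  X = np.array([
      # [n_never_clicked, n_breakoffs, invites_in_first_90_days]
      [0, 1, 6],
      [4, 0, 2],
      [1, 2, 5],
      [5, 1, 1],
  ])
  y = np.array([0, 1, 0, 1])  # 1 = attrited (stopped responding)

  forest = RandomForestClassifier(n_estimators=100, random_state=0)
  forest.fit(X, y)
  attrition_risk = forest.predict_proba(X)[:, 1]  # per-panelist risk score

Consistent with the first bullet above, the extra machinery of the multivariate model matters mainly during a panelist’s first several invitations, before the consecutive-nonresponse rule has enough history to work with.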

Beyond the Numbers: Methodological Considerations for Integrating Qualitative Research into a Probability Panel

Presented by Darby Steiger, SSRS VP of Innovation & Solutions

KEY CONCEPTS

SSRS has increasingly been using our Opinion Panel for qualitative research such as focus groups and in-depth interviews. This paper explored whether participation in qualitative research affects future survey participation. The good news reported at CIPHER was that the study found:

  • Participating in qualitative research has no impact on future rates of cooperation in panel surveys.
  • There is evidence that data quality and panelist satisfaction improve after participation in qualitative studies.
  • Receiving higher incentives for qualitative studies does not affect satisfaction with regular survey incentive levels.

Based on these findings, we are eager to continue leveraging our SSRS Opinion Panel for qualitative research studies, including cognitive testing, in-depth interviews, focus groups, and online bulletin boards. Read more about our approach >>

Whiplash? Measuring the Impact of Numerous Unrelated Topics on Omnibus Surveys Conducted on a Probability-based Panel

Presented by Kyle Berta, SSRS Director of Panel Products

KEY CONCEPTS

Omnibus surveys allow multiple researchers to share space on a single survey. This creates a unique survey experience in which respondents might answer questions about a wide variety of topics in a short period of time. We examined how the topic variety of the SSRS Opinion Panel Omnibus affects panelist behavior and found the following:

  • Increased topic variety is associated with lower survey satisfaction ratings but generally does not affect panelists’ satisfaction with their overall panel experience.
  • Quality-control failures increase slightly at the upper limits of topic variety.
  • Topic variety has no impact on panelists’ likelihood of responding to future SSRS Opinion Panel surveys, in either the short or the long term.

Overall, the impact of topic variety on subsequent panelist behavior is negligible. However, whenever a panel carries any sort of recurring, high-volume work, it is important to monitor any impact that work could be having on the panel as a whole. Read more about the SSRS Opinion Panel Omnibus >>

DECIPHER: Do End-users Consign Importance to ‘Probability-based’ for Household Electronic Research?

Panel discussion featuring Cameron McPhee, SSRS Chief Methodologist

KEY CONCEPTS

The aim of the panel was to discuss how we can demonstrate the value of probability panels to end-users who are tempted by much lower-cost opt-in online sample sources. Several key themes emerged from the discussion and continued to resonate throughout the conference:

  • It is important that we help clients determine the real risks of being wrong. When designing studies and choosing between probability and non-probability samples, the risk that the data are incorrect or biased is much greater with opt-in samples. In some circumstances, that is an acceptable level of risk, but it is important that clients evaluate the empirical and reputational risks of drawing and disseminating incorrect conclusions from the data.
  • Our current communication efforts are not effective. We need to rethink how we talk about the value of probability panels, be less technical, and find new audiences for our messaging.
  • We need to commit to radical transparency in our own methods, and we need to help consumers and reporters of survey data understand the importance of transparency to data quality.

Read more about the SSRS Opinion Panel methodology >>