The Turn to Address-Based Sampling

January 2019

Introduction

Address-based sampling (ABS) has been around for some time.  For the past decade, the U.S. Postal Service has made available to approved vendors the Computerized Delivery Sequence File (CDS, CDSF, or simply the DSF), the file principally used to organize mail delivery.

SSRS’s sister company, Marketing Systems Group, was one of the first approved vendors, and as such SSRS has a long history with ABS, starting with the 2008 Massachusetts Health Interview Survey and health interview surveys for other states.

ABS did not immediately catch on in many research circles.

First, in the late 2000s telephone data collection was still fairly cost-efficient, researchers were becoming comfortable with dual-frame (landline and cell phone) designs, and research was consistently finding that declining response rates were not significantly affecting data quality (Groves, 2006; Groves and Peytcheva, 2008; Keeter et al., 2000; Keeter et al., 2006; more recently, Dutwin and Buskirk, 2018).

Second, research also found larger systematic nonresponse among ABS respondents than among telephone respondents (Immerwahr et al., 2018; Jones and Tsabutahsvilli, 2018; Link et al., 2008; Rapoport et al., 2014; Sherr et al., 2012).  Specifically, respondents to ABS studies can be more likely to be college graduates and less likely to be non-White compared with both the target population and RDD samples.

In the past decade, however, telephone response rates have continued to decline significantly, and while there is evidence that data quality from telephone samples has not deteriorated (Dutwin and Buskirk, 2018), the threat of significant systematic bias in surveys that commonly attain response rates under 10 percent has nevertheless become a serious concern.  More strikingly, owing to a number of structural factors, telephone surveys are now commonly more expensive than ABS surveys, and, depending upon how ABS is operationalized, often substantially, even prohibitively, so.

At the same time, ongoing research has produced creative solutions to mitigate the systematic nonresponse in ABS noted above.  First, geographic data from various Census sources can be appended to the address-based sample frame (McMichael and Ridenhour, 2018).  In addition, Low Response Score (LRS) data from the Census Bureau can be used to predict areas of high and low nonresponse.  Such appends can be used to oversample areas rich in persons who traditionally respond less frequently to ABS surveys.
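To make the mechanics concrete, the sketch below shows one way an LRS append could drive an oversample.  It is a minimal illustration in Python; the file, column names, cut points, and sampling rates are all hypothetical rather than an actual SSRS specification.

```python
import pandas as pd

# Hypothetical ABS frame with a Census Low Response Score (LRS)
# appended at the tract level; all names and values are illustrative.
frame = pd.read_csv("abs_frame_with_lrs.csv")  # columns: address_id, tract, lrs

# Stratify addresses by predicted nonresponse: high-LRS tracts are
# expected to respond less often, so they are sampled at a higher rate.
frame["stratum"] = pd.cut(frame["lrs"], bins=[0, 20, 30, 100],
                          labels=["low", "mid", "high"])

# Illustrative sampling rates per stratum.
rates = {"low": 0.010, "mid": 0.015, "high": 0.025}

sample = (frame.groupby("stratum", group_keys=False)
               .apply(lambda g: g.sample(frac=rates[g.name], random_state=42)))

# The base weight is the inverse of the selection probability, so the
# oversampled strata can be weighted back to the population later.
sample["base_weight"] = sample["stratum"].map(lambda s: 1 / rates[s])
```

The base weight retained in the last step is what allows the oversampled strata to be restored to their correct population shares at the analysis stage.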

Another option is to leverage consumer and other available data that contain indicators of potential household characteristics (Jones and Tsabutahsvilli, 2018; McMichael et al., 2018; Rhodes and Marks, 2018).  Such appends are built from actual consumer behavior or by predictive modeling, and while they cannot guarantee a given characteristic for a household with absolute certainty, research has shown they substantially increase the probability that the characteristic is present (for example, a young person in the household, an African American, a person in a certain income bracket, even a cat or dog owner!).

Sample with such appends can be used to oversample and combat nonresponse.  For example, a common stratification scheme utilized by SSRS and others for a number of years leverages indicators of young persons and persons of color.  As noted in Jones and Tsabutahsvilli (2018), used modestly, such strategies can mitigate and even largely eliminate systematic nonresponse among these populations in ABS studies.  And even when there is greater nonresponse in ABS than in telephone, research suggests that proper weighting procedures significantly reduce bias (Immerwahr et al., 2018).
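The workhorse for such weighting adjustments is raking (iterative proportional fitting), which scales base weights until the weighted sample margins match known population figures.  The function below is a minimal sketch of the general technique, not SSRS's actual weighting protocol; its name and inputs are illustrative.

```python
import numpy as np

def rake(base_weights, sample_cats, pop_shares, max_iter=100, tol=1e-8):
    """Adjust base weights so weighted sample margins match population shares.

    base_weights: 1-D array of design (base) weights
    sample_cats:  dict of dimension name -> array of category codes per case
    pop_shares:   dict of dimension name -> {category: population share}
    """
    w = np.asarray(base_weights, dtype=float).copy()
    for _ in range(max_iter):
        max_adj = 0.0
        for dim, cats in sample_cats.items():
            total = w.sum()  # held fixed while adjusting this dimension
            for cat, share in pop_shares[dim].items():
                mask = np.asarray(cats) == cat
                current = w[mask].sum()
                if current > 0:
                    factor = share * total / current
                    w[mask] *= factor
                    max_adj = max(max_adj, abs(factor - 1.0))
        if max_adj < tol:  # stop once all margins are (nearly) satisfied
            break
    return w
```

In production the population shares would come from sources such as the American Community Survey, and steps like weight trimming and design-effect checks would follow.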

To be clear, ABS is not a method but a sampling frame.  Whereas in telephone surveys the frame and the method of data collection are effectively one and the same, with ABS there are many choices of operationalization.

Recent research shows that a single mailing urging respondents to complete a survey via a website, without incentives, can be a very fast, inexpensive, and accurate method for political polling (Barber et al., 2014; Dutwin, 2019).  Note that because persons without Internet access are typically not registered to vote, or rarely vote, losing the ability to interview them does not translate into a great loss of coverage of likely voters in this design.

On the other end of the scale are designs that pull out all the stops: for example, a design that begins with a push-to-web letter carrying a pre-incentive, offers a toll-free number to call in to, appends available telephone numbers so respondents can be called proactively, sends multiple reminder letters, and then sends a full paper-and-pencil survey as a last effort, with a promised incentive for participation.  Such a design allows for response in three different modes (web, phone, and mail), with two different incentive structures, and may vary mailing strategies (first-class versus priority mail, for example).

Of course, such designs must be wary of mode effects and the paradox of response (Olson et al., 2012), that is, that overall response can be depressed if respondents are offered multiple ways to respond.  They can also surpass telephone studies in cost, but at the same time they offer the promise of very high (relatively speaking, in today's world) response rates and high data quality.

As telephone response rates decline well into the single digits, many researchers face the choice of eschewing telephone for an alternative methodology while maintaining the rigor of probability-based sampling.  ABS offers a flexible sampling frame that can be operationalized for fast surveys or for a more measured approach; for low-cost, low-response-rate surveys or for surveys with response rates surpassed only by far more expensive in-person designs.

As with all methodological choices, ABS offers a number of advantages compared to its main alternative, RDD, as well as a number of challenges:

Advantages

A single sampling frame

ABS is a single frame with, these days, excellent coverage.  Dual-frame telephone designs, by their very nature, utilize two overlapping frames, which adds a measure of uncertainty to probabilities of selection and requires more complex weighting adjustments.  In other words, the ABS frame provides more certainty and allows for straightforward weighting protocols.
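To illustrate the difference, a standard dual-frame composite estimator must blend the domain of respondents reachable on both frames, whereas a single-frame, equal-probability ABS design needs only one base weight.  The notation below is a generic textbook illustration, not a description of any particular SSRS weighting scheme.

```latex
% Dual frame: landline-only and cell-only domains, plus a mixing
% parameter \lambda for the overlap domain reachable on both frames.
\[
\hat{Y}_{\mathrm{DF}}
  = \hat{Y}_{\mathrm{LL\,only}} + \hat{Y}_{\mathrm{cell\,only}}
  + \lambda\,\hat{Y}_{\mathrm{overlap}}^{\mathrm{LL}}
  + (1-\lambda)\,\hat{Y}_{\mathrm{overlap}}^{\mathrm{cell}},
  \qquad 0 \le \lambda \le 1 .
\]
% Single ABS frame: an equal-probability sample of n addresses from a
% frame of N requires only the base weight
\[
w_i = \frac{N}{n}.
\]
```

Estimating the mixing parameter and correctly classifying every respondent into the right domain are precisely the complications the single ABS frame avoids.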

Higher response rate (typically)

While response rates on either sampling frame will vary based on the specific operationalization, in general ABS will attain significantly higher response rates than telephone studies (Link et al., 2008), particularly when ABS is administered in multiple modes.

Geographical precision

When targeting specific geographic areas (e.g., a state, county, or metropolitan area), telephone samples suffer from under-coverage, missing people who live in the target area but whose phone numbers come from outside it (a growing concern), and over-coverage, dialing numbers in the sample frame whose owners live outside the target area.  Under-coverage may lead to systematic bias in the sample, while over-coverage leads to higher costs.  ABS addresses both problems by targeting the household rather than a telephone exchange.  Furthermore, the geographical precision of ABS opens up a wide range of possibilities for geographic stratification and targeting with Census data at the block-group level.  Indeed, these days surveys of particularly difficult geographies like Congressional Districts are straightforward with ABS and functionally impossible with telephone frames.
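As a concrete illustration, once block-group GEOIDs are appended to the frame, restricting a sample to an arbitrary geography reduces to a string match.  The sketch below is hypothetical: the file, column name, and FIPS codes are invented for the example.

```python
import pandas as pd

# Hypothetical frame with an appended block-group GEOID per address
# (2-digit state FIPS + 3-digit county FIPS + tract + block group).
frame = pd.read_csv("abs_frame_geocoded.csv", dtype={"geoid_bg": str})

target_counties = {"42101", "42017"}  # illustrative state+county FIPS codes

# Keep only addresses whose GEOID places them in the target counties.
in_scope = frame[frame["geoid_bg"].str[:5].isin(target_counties)]
```

The same appended codes support stratification by block-group characteristics drawn from Census data.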

Challenges

Control over number of interviews

If there is one major disadvantage of ABS, it is the relative difficulty of getting a precise number of interviews from most ABS applications.  In telephone surveys, if a study is falling short of its targets one can place more call attempts, retrain interviewers, or release additional sample nearly instantly.  In ABS, it can often be quite difficult to predict what “yield” (the number of addresses that must be invited to obtain a single interview) a study will achieve, as yield will vary greatly with how ABS is operationalized, the sponsor, the topic, the design of the materials, and so on.  ABS designs typically send out invitation mailings, after which survey firms can do little other than wait for results.  If more sample is required, that can mean additional printings, mailings, delays, and considerable additional expense.
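The arithmetic below shows why yield assumptions dominate mail planning; all numbers are invented for illustration.

```python
# Back-of-the-envelope mail planning under the yield definition above
# (addresses invited per completed interview); numbers are illustrative.
target_completes = 1_000
assumed_yield = 20       # 20 invitations per complete, i.e., a 5% response

addresses_to_mail = target_completes * assumed_yield  # 20,000

# If the realized yield is worse (say 25 invitations per complete), the
# same mailing returns only 20,000 / 25 = 800 completes, and recovering
# the shortfall means a new print-and-mail cycle.
```

One common mitigation is to hold back replicates of sample that can be released in later mailings if early returns fall short.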

Targeting low-incidence populations

ABS has inherently more difficulty capturing low-incidence populations.  It remains true that telephone interviewing can be relatively efficient at screening households for an eligible low-incidence respondent.  In ABS, costs can skyrocket for low-incidence research, since a 10% incidence necessarily requires a tenfold increase in mailings, printing, and so on compared with a 100% incidence.  ABS designs will sometimes utilize a two-stage protocol (a screening interview followed by a main interview), but again this drives up costs and lowers the overall response rate.
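A quick calculation makes the cost mechanics plain; figures are invented, and eligibility is assumed (naively) to be independent of response.

```python
# Illustrative screening arithmetic for a low-incidence ABS study.
completes_needed = 500
incidence = 0.10        # 10% of households contain an eligible respondent
response_rate = 0.05    # share of invited households that respond at all

responding_households = completes_needed / incidence          # 5,000
addresses_to_invite = responding_households / response_rate   # 100,000

# At 100% incidence the same design would need only 10,000 invitations:
# the tenfold incidence penalty passes straight through to printing
# and postage costs.
```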

Time in the field

Historically, ABS designs have required relatively lengthy field periods due to the time necessary to roll out the mailing and subsequent contact attempts.  However, the field period is highly dependent upon how ABS is administered, and ABS can be fielded in a week or less if needed.

Turning to ABS

The expertise required to execute ABS studies is different from that required for telephone.  For one, the need for telephone interviewers, and the infrastructure to hire, train, and maintain them, is either no longer required or greatly minimized, depending on the design of a given ABS study.  Second, a wealth of knowledge in sample design, minimizing mode effects, and effective web interfaces, to name a few areas, is required to conduct ABS at a high level of professionalism.  As studies transition from telephone to ABS, great care should be taken to ensure that trends in the data are effectively maintained through the transition.

SSRS’s view is that any study that measures trends should include a “gap year” in which one instantiation of the study is conducted part telephone and part ABS before transitioning fully to ABS.

If point estimates do change from one method to the other, such a transitional study is best positioned to investigate the potential causes of the change and perhaps to revise future occurrences of the study to mitigate such breaks in trend.  There is little to no published research on transitions, but a hybrid design, part telephone and part ABS, or alternatively an ABS pilot fielded to produce side-by-side data against telephone, can yield significant insight into understanding and addressing changes in trend.

Notably, SSRS was one of the first to field ABS studies with the CDSF and has accrued a wealth of expertise in address-based sampling as a whole, in deploying a variety of research modes from ABS, and in transitioning telephone surveys to ABS designs.  We welcome the opportunity to consult with you on your future ABS research.

References

Barber, M., Mann, C., Monson, J.Q., and Patterson, K. (2014). Online Polls and Registration-Based Sampling: A New Method for Pre-Election Polling. Political Analysis, doi:10.1093/pan/mpt023.

Dutwin, D., and Buskirk, T. (2018). Telephone Sample Surveys: Dearly Beloved or Nearly Departed? Trends in Survey Errors in the Era of Declining Response Rates. Under Peer Review.

Dutwin, D. (2019). One Shot, Aim True: Comparisons of Telephone and Mail Surveys for Political Polling. Paper presented at the 2019 Conference of the American Association for Public Opinion Research, Toronto, Canada.

Groves, R.M. (2006). Nonresponse Rates and Nonresponse Bias in Household Surveys. Public Opinion Quarterly, 70(5): 646-675.

Groves, R.M., and Peytcheva, E. (2008). The Impact of Nonresponse Rates on Nonresponse Bias: A Meta-Analysis. Public Opinion Quarterly, 72(2): 167-189.

Immerwahr, S., Lim, S., and Seligson, A. (2018). Is ABS More Representative than RDD for Public Health Surveillance Surveys? Paper presented at the 2018 Conference of the American Association for Public Opinion Research, Denver, CO.

Jones, J., and Tsabutahsvilli, D. (2018). Strategies for Oversampling Hard-to-Reach Respondents in Mail Surveys. Paper presented at the 2018 Conference of the American Association for Public Opinion Research, Denver, CO.

Keeter, S., Miller, C., Kohut, A., Groves, R.M., and Presser, S. (2000). Consequences of Reducing Nonresponse in a Large National Telephone Survey. Public Opinion Quarterly, 64(2): 125-148.

Keeter, S., Kennedy, C., Dimock, M., Best, J., and Craighill, P. (2006). Gauging the Impact of Growing Nonresponse on Estimates from a National RDD Telephone Survey. Public Opinion Quarterly, 70(5): 759-779.

Link, M., Battaglia, M., Frankel, M., Osborn, L., and Mokdad, A. (2008). A Comparison of Address-Based Sampling (ABS) versus Random-Digit Dialing (RDD) for General Population Surveys. Public Opinion Quarterly, 72(1): 6-27.

Link, M., Battaglia, M., Frankel, M., Osborn, L., and Mokdad, A. (2006). Address-Based versus Random-Digit-Dial Surveys: Comparison of Key Health and Risk Indicators. American Journal of Epidemiology, 164(10). DOI: 10.1093/aje/kwj310.

McMichael, J., and Ridenhour, J. (2018). Improving Demographic Information for Address-Based Sampling (ABS) Frames. Paper presented at the 2018 Conference of the American Association for Public Opinion Research, Denver, CO.

McMichael, J., Harter, R., Shook-Sa, B., Ridenhour, J., and Morris, J. (2018). Evaluation of Combining Consumer Marketing Data Used for Address-Based Sampling. Paper presented at the 2018 Conference of the American Association for Public Opinion Research, Denver, CO.

Olson, K., Smyth, J.D., and Wood, H. (2012). Does Giving People Their Preferred Survey Mode Actually Increase Survey Participation Rates? An Experimental Examination. Public Opinion Quarterly, 76(4): 611-635.

Rapoport, R., Sherr, S., and Dutwin, D. (2014). Address Based Samples: Key Factors in Refining this Research Methodology. SSRS Whitepaper Archive, September 2014.

Rhodes, B., and Marks, E. (2018). Who Do You Think We Are? Using Ancillary Data to Identify Racial and Ethnic Subgroups. Paper presented at the 2018 Conference of the American Association for Public Opinion Research, Denver, CO.

Sherr, S., Rapoport, R., and Dutwin, D. (2012). Does Ethnically Stratified Address-Based Sample Result in Both Ethnic and Class Diversity? Case Studies in Oregon and Houston. Paper presented at the 2012 Conference of the American Association for Public Opinion Research, Orlando, FL.

ABOUT THE AUTHOR

David Dutwin, Ph.D.

SSRS EVP & Chief Methodologist

David Dutwin, Ph.D., is primarily responsible for sampling designs, project management, executive oversight, weighting, and statistical estimation.  He is an active member of the survey research community, having served the American Association for Public Opinion Research as a member and chair of special task forces and as a member of the Standards, Communications, and Heritage Committees; taught multiple short courses and webinars; won the Student Paper award in 2002; and served as the 2016 Conference Chair.  He was elected to the AAPOR Executive Council in 2017 and serves as the 2017 Vice President/2018 President.  David is a Senior Fellow with the Program for Opinion Research and Election Studies at the University of Pennsylvania.

He holds a master’s degree in Communication from the University of Washington and a doctorate in Communication and Public Opinion from the Annenberg School for Communication at the University of Pennsylvania.  David attained his bachelor’s degree in Political Science and Communication from the University of Pittsburgh.

He has taught Research Methods, Rhetorical Theory, Media Effects, and other courses as an Adjunct Professor at West Chester University and the University of Pennsylvania for over a decade.  David is also a Research Scholar at the Institute for Jewish and Community Research.  His publications are wide-ranging, including a 2008 book on media effects and parenting; methodology articles for Survey Practice, the MRA magazine Alert!, and other publications; and a range of client reports, most recently on Hispanic acceptance of LGBT populations, which he presented at a Congressional briefing in 2012.
