Use of Paradata in a Responsive Design Framework to Manage a Field Data Collection

Publication Abstract

Wagner, James, Brady T. West, N. Kirgis, James M. Lepkowski, William Axinn, and S. Ndiaye. 2012. "Use of Paradata in a Responsive Design Framework to Manage a Field Data Collection." Journal of Official Statistics, 28(4): 477-499.

In many surveys there is a great deal of uncertainty about key design parameters, and hence about the cost and error structures of the surveys. Responsive survey designs use indicators of potential survey error to determine when design changes should be made on an ongoing basis during data collection. These changes are meant to minimize total survey error. They are made during the field period as updated estimates of proxy indicators for the various sources of error become available. In this article we illustrate responsive design in a large continuous data collection: the 2006-2010 U.S. National Survey of Family Growth. We describe three paradata-guided interventions designed to improve survey quality: case prioritization, "screener week," and sample balance. Our analyses demonstrate that these interventions systematically alter interviewer behavior, creating beneficial effects on both efficiency and proxy measures of the risk of nonresponse bias, such as variation in subgroup response rates.
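The case-prioritization idea lends itself to a short illustration. The sketch below, in Python, flags active cases in subgroups whose response rates lag the overall rate, so interviewer effort can be redirected toward reducing variation in subgroup response rates. The field names, the threshold rule, and the example data are hypothetical illustrations of the general technique, not the NSFG's actual paradata or protocol.

# Minimal sketch of paradata-guided case prioritization, under the
# assumptions stated above (hypothetical fields and threshold rule).
from collections import defaultdict

def subgroup_response_rates(cases):
    """Compute per-subgroup response rates from case-level paradata.

    Each case is a dict with a 'subgroup' label and a 'status' of
    'complete', 'active', or 'nonrespondent'.
    """
    totals = defaultdict(int)
    completes = defaultdict(int)
    for case in cases:
        totals[case["subgroup"]] += 1
        if case["status"] == "complete":
            completes[case["subgroup"]] += 1
    return {g: completes[g] / totals[g] for g in totals}

def prioritize_cases(cases, shortfall=0.05):
    """Flag active cases in under-responding subgroups for extra effort.

    A subgroup is treated as under-responding if its response rate falls
    more than `shortfall` below the overall rate -- an illustrative rule.
    """
    rates = subgroup_response_rates(cases)
    overall = sum(1 for c in cases if c["status"] == "complete") / len(cases)
    lagging = {g for g, r in rates.items() if r < overall - shortfall}
    return [c for c in cases
            if c["status"] == "active" and c["subgroup"] in lagging]

# Example: a paradata snapshot partway through the field period.
cases = [
    {"id": 1, "subgroup": "urban", "status": "complete"},
    {"id": 2, "subgroup": "urban", "status": "complete"},
    {"id": 3, "subgroup": "urban", "status": "active"},
    {"id": 4, "subgroup": "rural", "status": "nonrespondent"},
    {"id": 5, "subgroup": "rural", "status": "active"},
    {"id": 6, "subgroup": "rural", "status": "active"},
]
for case in prioritize_cases(cases):
    print("Prioritize case", case["id"])  # the lagging rural cases

Run on this snapshot, the rural subgroup's response rate (0 of 3) trails the overall rate (2 of 6), so its two active cases are flagged for prioritized interviewer effort.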
