Mon, March 20, 2017, noon:
Dean Yang, Taken by Storm
Tourangeau, Roger, Robert M. Groves, and Cleo D. Redline. 2010. "Sensitive Topics and Reluctant Respondents: Demonstrating a Link between Nonresponse Bias and Measurement Error." Public Opinion Quarterly, 74(3): 413–432.
Survey researchers have long speculated that there may be a link between nonresponse and measurement error: people who are likely to become nonrespondents to a survey are also likely to be poor reporters if they do take part. Still, there is surprisingly little evidence of such a link. It could be that nonresponse is generally the product of one set of factors and reporting errors the product of an unrelated set, or both nonresponse and reporting errors may be item-specific, so that no general relationship between the two emerges. Our study examined a situation in which we expected a link between response propensities and the propensity to give inaccurate answers. We asked samples of voters and nonvoters to take part in a survey that included items about voting. Past research shows that nonvoters misreport having voted and that they are less likely than voters to take part in surveys at all. We thought we could heighten the differences between voters and nonvoters in both response rates and levels of misreporting by characterizing the survey as being about politics. However, only the nonresponse bias was larger when the topic of the survey was described as political, and the difference was only marginally significant. The two ways of framing the study had even smaller effects on estimates derived from other items in the questionnaire. The overall biases in estimates derived from the voting items are very substantial, and both nonresponse and measurement error contribute to them.
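The abstract's closing claim, that both nonresponse and measurement error contribute to the overall bias, can be illustrated with a simple additive decomposition. The sketch below uses invented numbers (not figures from the paper) and assumes the standard decomposition in which the total bias of a respondent-based estimate splits into a nonresponse component (respondents differ from the full sample) and a measurement component (respondents misreport their own behavior):

```python
# Hypothetical illustration of bias decomposition for a turnout estimate.
# All numbers are invented for the example, not taken from the study.

true_turnout_full_sample = 0.60   # actual voting rate in the full sample
true_turnout_respondents = 0.70   # actual voting rate among respondents only
reported_turnout = 0.82           # turnout rate that respondents reported

# Nonresponse bias: voters respond more often, so respondents' true
# turnout exceeds the full sample's true turnout.
nonresponse_bias = true_turnout_respondents - true_turnout_full_sample

# Measurement bias: some nonvoting respondents claim to have voted,
# so reported turnout exceeds respondents' true turnout.
measurement_bias = reported_turnout - true_turnout_respondents

# Total bias of the survey estimate relative to the truth.
total_bias = reported_turnout - true_turnout_full_sample

print(round(nonresponse_bias, 2))  # 0.1
print(round(measurement_bias, 2))  # 0.12
print(round(total_bias, 2))        # 0.22
```

With these made-up values, the survey would overstate turnout by 22 percentage points, with roughly comparable contributions from who responds and from how they report, which is the pattern of joint contribution the abstract describes.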
Country of focus: United States of America.