Measurement Error, Unit Nonresponse, and Self-Reports of Abortion Experiences

Archived Abstract of Former PSC Researcher

Peytchev, Andy, Emilia Peytcheva, and Robert M. Groves. 2010. "Measurement Error, Unit Nonresponse, and Self-Reports of Abortion Experiences." Public Opinion Quarterly, 74(2): 319-327.

Survey designs for producing population prevalence estimates, such as abortion rates, need to consider multiple sources of error. Abortion prevalence has been found to suffer from underreporting, and abortion rates can also be underestimated due to unit nonresponse. It is when these two phenomena are linked, however, that the problem becomes particularly critical for both researchers interested in studying abortion rates and survey research practitioners. For substantive researchers, results may depend on the mix of these survey errors in unexpected ways. For survey methodologists, decreasing one source of error may increase bias in estimates by increasing the influence of another source. Identifying common causes can indicate when to expect differences in rates due to the mix of nonresponse and measurement error, while helping practitioners design surveys that reduce both sources of error. This article addresses both nonresponse and measurement error in abortion estimates. We found that those with a lower likelihood of participating in the survey were also more likely to underreport such experiences. We interpret both nonresponse and measurement error as stemming from a common cause: the likely social stigma that reporting these experiences poses to individuals. Although these results show that naïvely increasing response rates may lead to greater bias in survey estimates, we also find limited evidence that some survey design changes can reduce the link between nonresponse and measurement error. This finding warrants replication, as it suggests that practitioners may be able to break the link between these sources of error.
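
The shared-cause mechanism described in the abstract can be illustrated with a minimal simulation sketch. The sketch below is not from the paper; all parameter values, and the assumption that stigma both suppresses participation and suppresses truthful reporting, are purely illustrative. It shows how the two error sources combine to pull a naive prevalence estimate below the true rate.

```python
# Minimal simulation sketch (hypothetical parameters, not from the paper):
# a single shared cause -- social stigma -- drives both unit nonresponse
# and underreporting, biasing the naive prevalence estimate downward.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000                     # hypothetical population size
TRUE_PREV = 0.15                  # hypothetical true prevalence

experienced = rng.random(N) < TRUE_PREV

# Assume stigma is more common among those with the experience.
stigma = rng.random(N) < np.where(experienced, 0.6, 0.1)

# Stigma lowers the chance of taking part in the survey (unit nonresponse)...
responds = rng.random(N) < np.where(stigma, 0.4, 0.7)

# ...and, among participants with the experience, lowers the chance of
# reporting it truthfully (measurement error).
reports = experienced & (rng.random(N) < np.where(stigma, 0.5, 0.9))

print(f"true prevalence:                   {experienced.mean():.3f}")
print(f"measurement error only (everyone): {reports.mean():.3f}")
print(f"nonresponse only (true values):    {experienced[responds].mean():.3f}")
print(f"both errors (observed estimate):   {reports[responds].mean():.3f}")
```

Under these assumed parameters, the estimate based on respondents' self-reports falls further below the true prevalence than either error source would produce alone, which is the sense in which the two sources need to be considered jointly rather than one at a time.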

DOI: 10.1093/poq/nfq002 (Full Text)

http://poq.oxfordjournals.org/cgi/content/full/nfq002v1

Country of focus: United States of America.
