

Collaborative Research: Decomposing Interviewer Variance in Standardized and Conversational Interviewing

a PSC Research Project

Investigators: Brady T. West, Frederick G. Conrad

Standardized Interviewing (SI) requires survey interviewers to read questions exactly as worded and to offer only neutral, non-directive probes in response to respondents' questions. Although many major surveys in the government, non-profit, and private sectors use SI to minimize interviewer effects on data quality, the existing literature shows that between-interviewer variance in key survey statistics persists even when random subsamples are assigned to interviewers. Because this between-interviewer variance reduces the precision of a survey estimate in the same way that a smaller sample size does, it has direct cost implications for survey design. Survey methodologists suspect that, despite proper training in SI, interviewers still diverge from their scripts because respondents often request additional explanation.

Conversational Interviewing (CI) is known to handle clarification requests more effectively: interviewers are trained to read each question as worded initially, and then to say whatever is required to help respondents understand it. Although the literature demonstrates that CI produces noticeable decreases in the measurement error bias of survey estimates, survey researchers (and governmental agencies in particular) have been hesitant to employ it in practice, partly because of increased questionnaire administration time and partly for fear of increased interviewer variance.

The proposed research activity aims to compare the interviewer variance, bias, and mean squared error (MSE) arising in a variety of survey estimates under these two face-to-face interviewing techniques, and to decompose the total interviewer variance introduced by each technique into measurement error variance and nonresponse error variance among interviewers. Doing so requires the interpenetrated assignment of sampled cases to professional interviewers, together with the presence of high-quality administrative records, and we propose an original data collection in Germany with these required features to meet our research aims.
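The decomposition described above can be illustrated with a minimal simulation sketch. This is not the project's actual design or data; all parameter values, effect sizes, and the simple additive error model are hypothetical. Each simulated interviewer receives two random effects: a measurement effect that shifts the answers respondents report, and a nonresponse effect that shifts the mean of the true values among the cases who respond. The between-interviewer variance of the interviewer-level means then reflects both components:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_technique(n_interviewers=50, n_per_iwer=40,
                       sd_measurement=0.5, sd_nonresponse=0.3):
    """Simulate one interviewing technique (all values hypothetical).

    Each interviewer gets a measurement effect (shifting reported
    answers) and a nonresponse effect (shifting the mean true value
    among the cases that respond). Returns the per-interviewer mean
    of reported values plus the two underlying effects.
    """
    meas = rng.normal(0.0, sd_measurement, n_interviewers)     # measurement error effects
    nonresp = rng.normal(0.0, sd_nonresponse, n_interviewers)  # nonresponse error effects
    means = []
    for i in range(n_interviewers):
        # True values among this interviewer's respondents, shifted by
        # the nonresponse effect (as if administrative records existed).
        true_vals = rng.normal(10.0 + nonresp[i], 2.0, n_per_iwer)
        # Reported values add the measurement effect plus item-level noise.
        reported = true_vals + meas[i] + rng.normal(0.0, 1.0, n_per_iwer)
        means.append(reported.mean())
    return np.array(means), meas, nonresp

means, meas, nonresp = simulate_technique()

# Total between-interviewer variance of the estimate, and the two
# components it would be decomposed into (recoverable here because the
# simulation, like the proposed design, has "records" of true values).
total_var = means.var(ddof=1)
meas_var = meas.var(ddof=1)
nonresp_var = nonresp.var(ddof=1)
print(f"total between-interviewer variance: {total_var:.3f}")
print(f"  measurement component:            {meas_var:.3f}")
print(f"  nonresponse component:            {nonresp_var:.3f}")
```

In a real study the two components cannot be separated from reported answers alone, which is why the design requires both interpenetrated assignment (so interviewer differences are not confounded with subsample differences) and administrative records (so measurement error can be isolated from nonresponse error).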

Funding Period: 09/15/2013 to 08/31/2017
