

Improving the Design of Health Surveys on the Web

a PSC Research Project [ARCHIVE DISPLAY]

Investigators: Frederick G. Conrad, Mick P. Couper

The goal of this project is to discover principles for improving interface designs for web surveys and similar applications. We focus on principles for web surveys, but believe our studies will have broader implications for the design of web pages that convey information as well as capture it, and for other interfaces as well. Our proposed studies examine three areas — screen design, methods for conveying definitions, and differences between experts and novices. All three areas raise important theoretical and practical questions, and all three have been the subject of relatively little prior research.

Our first aim is to determine the principles governing the best arrangement of material on a web page. Users prefer intermediate levels of visual complexity, but other features (such as a logical arrangement of the visual elements) are more critical than sheer complexity.

Our second aim is to determine the most effective methods for conveying definitions and other instructions. Prior research suggests that respondents often ignore definitional material for a variety of reasons — they are reluctant to expend the effort to attend to definitional material, they may not think they need it, and they may be unwilling to apply the definition when, in fact, it would be useful. There has been little work on how to improve the effectiveness of definitional material. We examine several approaches — stating a general rule, mentioning specific examples of the category of interest, and avoiding definitions altogether by asking very specific questions.

Our third aim is to investigate differences between expert and novice users. Many interfaces are designed to accommodate novice or even first-time users and thus may include features that are inefficient for more expert users. In addition, experts may fall into habits that are reasonably effective but not optimal.
We propose four experiments to contrast the performance of experts and novice users, to demonstrate the poor performance of experts with many interfaces, and to develop methods for accommodating both types of users. Although our basic method for exploring each of these issues is to conduct experiments embedded in large-scale, realistic web surveys, we plan to collect various types of “paradata” (such as response times and patterns of mouse clicks) in addition to answers to the survey questions. We propose several follow-up experiments that use eye-tracking equipment to clarify what information respondents actually attended to as they completed the surveys.
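Paradata of the kind described here amounts to a timestamped event log alongside the survey answers. A minimal sketch of how such a log might be summarized per question is below; the event names and field layout are illustrative assumptions, not the project's actual instrumentation:

```python
from dataclasses import dataclass

@dataclass
class ParadataEvent:
    question_id: str
    event: str   # hypothetical event types: "shown", "click", "answered"
    t_ms: int    # timestamp in milliseconds since survey start

def summarize(events):
    """Compute per-question response time and click count from an event log."""
    by_question = {}
    for e in events:
        q = by_question.setdefault(
            e.question_id, {"shown": None, "answered": None, "clicks": 0}
        )
        if e.event == "shown":
            q["shown"] = e.t_ms
        elif e.event == "answered":
            q["answered"] = e.t_ms
        elif e.event == "click":
            q["clicks"] += 1
    # Keep only questions with both a display and an answer timestamp
    return {
        qid: {"response_time_ms": v["answered"] - v["shown"], "clicks": v["clicks"]}
        for qid, v in by_question.items()
        if v["shown"] is not None and v["answered"] is not None
    }

log = [
    ParadataEvent("q1", "shown", 0),
    ParadataEvent("q1", "click", 1200),
    ParadataEvent("q1", "click", 2500),
    ParadataEvent("q1", "answered", 3100),
]
print(summarize(log))
```

Response times and click patterns summarized this way are the sort of derived measures that can then be compared across experimental conditions (e.g., expert vs. novice respondents).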

This project is motivated by the explosive growth of the web as a vehicle for collecting and disseminating public health information. Our studies will help public health researchers gather more accurate information; in addition, our results will also be useful to health workers who are using the web to convey public health information or to deliver health interventions.

Funding Period: 05/01/2007 to 04/30/2012

Country of Focus: USA

Related Publication:

Galesic, M., Roger Tourangeau, Mick P. Couper, and Frederick G. Conrad. 2008. "Eye-Tracking Data: New Insights on Response Order Effects and Other Cognitive Shortcuts in Survey Responding." Public Opinion Quarterly, 72(5): 892-913. PMCID: PMC3022327.

This PSC Archive record is displayed for historical reference.
