
Statistics back end for Gates evaluation engine

a PSC Research Project [ARCHIVE DISPLAY]

Investigator:   Ben Hansen

Many educational interventions can be evaluated informatively by comparing administrative outcomes of intervention-group students to those of comparison groups consisting of similar students. If similarity is understood in terms of variables available to state educational agencies, so that the evaluation is driven by comparisons among students matched on administrative variables, then evaluations of this kind ought to be simple to perform. In practice, however, gaining access to these legally protected databases is costly and time-consuming, and may require masking of some of the data. The Gates Foundation's Evaluation Engine will drastically streamline the process of performing matched comparisons using administrative education data by presenting a web interface, database, and statistical analysis program that enable researchers to conduct statistical analyses on FERPA-protected data to which they have no direct access. This project develops and contributes an important component of the Gates Evaluation Engine: the programs and routines it uses to prepare and present statistical analyses.
In a typical Evaluation Engine usage scenario, the researcher presents student identifiers for an intervention group enrolled in schools in a given state. The Evaluation Engine then identifies those students within a protected state database, finds comparable students not belonging to the intervention group within that same state's database, and performs a suitable statistical analysis comparing outcomes between the intervention-group students and the comparison-group students. The results of this analysis are then presented to the researcher in an automatically generated report.
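The matching step of this scenario can be illustrated with a minimal sketch. The records, field names, and distance function below are hypothetical, and the greedy nearest-neighbor pairing shown here is a simplification, not the Evaluation Engine's actual matching algorithm:

```python
# Greedy 1:1 nearest-neighbor matching on administrative covariates.
# Illustrative only: the schema (prior_score, grade) and the distance
# measure are hypothetical stand-ins for state administrative variables.

def distance(a, b):
    """Euclidean distance over two hypothetical administrative variables."""
    return ((a["prior_score"] - b["prior_score"]) ** 2
            + (a["grade"] - b["grade"]) ** 2) ** 0.5

def greedy_match(intervention, pool):
    """Pair each intervention student with the closest unused comparison student."""
    matches, used = [], set()
    for t in intervention:
        best = min((c for c in pool if c["id"] not in used),
                   key=lambda c: distance(t, c))
        used.add(best["id"])
        matches.append((t["id"], best["id"]))
    return matches

intervention = [{"id": 1, "prior_score": 0.8, "grade": 5},
                {"id": 2, "prior_score": 0.3, "grade": 6}]
pool = [{"id": 10, "prior_score": 0.75, "grade": 5},
        {"id": 11, "prior_score": 0.35, "grade": 6},
        {"id": 12, "prior_score": 0.90, "grade": 4}]

print(greedy_match(intervention, pool))  # [(1, 10), (2, 11)]
```

In production, greedy pairing would typically be replaced by an optimal matching procedure, which avoids order-dependent pairings when the comparison pool is tight.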
This project consists of developing statistical methods and software for the specifically statistical aspects of these Evaluation Engine tasks, namely: assembling, preparing, and periodically updating the state databases; finding, in real time, suitable matches for the members of an intervention group from among the complementary collection of students within the same state; validating collections of matches in ways that are intelligible and useful to researchers; and comparing intervention subjects and their matched controls for overall differences (on outcome measures appearing in the database) and for selected subgroup differences, with appropriate adjustment for residual differences left by the matching procedure. The team executing the project is led by Ben Hansen, associate professor of statistics and research associate of the Education and Well-Being Group, Survey Research Center, Institute for Social Research, University of Michigan. Also contributing are Stephen Schilling, research scientist in the Education and Well-Being Group, who brings expertise in psychometrics and in algorithms for psychometric modeling; statistical programmers Mark Fredrickson and Peter Solenberger; a survey research operations specialist who will arrange protocols for data transfer between contributing state agencies and the project team; and one or several University of Michigan students who will contribute to our quality assurance program.
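One standard way to validate a collection of matches, in the sense described above, is to check covariate balance: after matching, standardized mean differences between the groups on each administrative variable should be small. The sketch below is illustrative only (the data and variable are hypothetical, and this is not the project's actual validation code):

```python
# Balance check via standardized mean difference (SMD) for one covariate.
# A common rule of thumb treats |SMD| > 0.1 as meaningful residual imbalance.

def std_mean_diff(treated, control):
    """(mean_treated - mean_control) / pooled standard deviation."""
    mt = sum(treated) / len(treated)
    mc = sum(control) / len(control)
    vt = sum((x - mt) ** 2 for x in treated) / (len(treated) - 1)
    vc = sum((x - mc) ** 2 for x in control) / (len(control) - 1)
    pooled_sd = ((vt + vc) / 2) ** 0.5
    return (mt - mc) / pooled_sd

# Hypothetical prior test scores for matched intervention/comparison students.
treated = [0.80, 0.30, 0.55]
control = [0.75, 0.35, 0.50]
print(round(std_mean_diff(treated, control), 3))  # 0.073
```

Because the SMD is scale-free, the same threshold can be applied across covariates measured in different units, which makes the check intelligible to researchers who never see the underlying protected records.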
The statistical back end will be developed in communication with a separate development team responsible for the user interface; with David Silver, the responsible program officer at the Gates Foundation; and with state agencies that have agreed to share data for the project. The development will occur in three stages. The first stage, on which work will begin immediately, is the development of a prototype back-end program for the user interface team to coordinate with in its programming efforts; this prototype will be completed during summer 2012, probably in July. The second stage will involve developing a first working version of the statistical program itself; work will commence on this shortly after the first state databases arrive and will continue into late 2012. At the third stage we will add statistical enhancements and modifications to improve the

Funding Period: 06/15/2012 to 12/31/2014

This PSC Archive record is displayed for historical reference.
