



Sonja B. Starr

Starr critical of risk assessment scores for sentencing

a PSC In The News reference

"This Guy Trains Computers to Find Future Criminals" - Bloomberg. 07/18/2016.

Some courts use automatically generated risk scores in sentencing and paroling criminals. Generated by algorithms using factors such as prior arrests, types of crimes, and demographic information, risk scores are intended to help judges overcome unconscious bias. But critics like Sonja Starr think they can hide prejudices beneath a façade of computerized objectivity. Starr says that although social science research uses socioeconomic and demographic variables to predict population-level behaviors, these models should not be used to calculate a single person's future, especially when the variables involved reflect decades of racial and socioeconomic disparities. In her view, this amounts to a computerized version of racial profiling. Starr says she and others are particularly concerned that "every mark of poverty serves as a risk factor" in these assessments.
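The Bloomberg article describes these tools only in general terms. As a rough illustration of the kind of scoring Starr criticizes, the sketch below shows a hypothetical logistic-style risk score in which poverty-linked factors carry weight alongside criminal history; the factor names and weights are invented for this example and do not come from any actual sentencing instrument.

    # Hypothetical sketch of an actuarial risk score; weights and factors are
    # invented for illustration and do not reflect any real sentencing tool.
    import math

    WEIGHTS = {
        "prior_arrests": 0.35,
        "age_at_first_offense": -0.04,
        "unemployed": 0.50,        # socioeconomic proxy of the kind Starr criticizes
        "unstable_housing": 0.45,  # another poverty-linked factor
    }
    INTERCEPT = -2.0

    def risk_score(defendant: dict) -> float:
        """Return a 0-1 'risk of reoffending' estimate from defendant attributes."""
        z = INTERCEPT + sum(WEIGHTS[k] * defendant.get(k, 0) for k in WEIGHTS)
        return 1.0 / (1.0 + math.exp(-z))  # logistic link maps the score to a probability

    if __name__ == "__main__":
        # Two defendants with identical criminal histories but different
        # socioeconomic circumstances receive different scores.
        base = {"prior_arrests": 2, "age_at_first_offense": 19}
        poor = {**base, "unemployed": 1, "unstable_housing": 1}
        print(f"stable circumstances: {risk_score(base):.2f}")
        print(f"poverty markers:      {risk_score(poor):.2f}")

In this toy model, the second defendant's score rises solely because of poverty indicators, which is the pattern Starr objects to when such scores inform sentencing.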

Researcher:

Sonja B. Starr
