Sonja B. Starr

Starr critical of risk assessment scores for sentencing

A PSC In The News reference, 2016

"This Guy Trains Computers to Find Future Criminals" - Bloomberg. 07/18/2016.

Some courts use automatically generated risk scores in sentencing and paroling criminals. Generated by algorithms from factors such as prior arrests, types of crimes, and demographic information, risk scores are intended to help judges overcome unconscious bias. But critics like Sonja Starr think they can hide prejudices beneath a façade of computerized objectivity. Starr says that although social science research uses socioeconomic and demographic variables to predict population-level behaviors, these models should not be used to calculate a single person's future, especially when the variables used are shaped by decades of racial and socioeconomic disparities. In her view, this amounts to a computerized version of racial profiling. Starr says she and others are particularly concerned that "every mark of poverty serves as a risk factor" in these assessments.
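To make the criticism concrete, here is a minimal sketch of the kind of scoring the paragraph describes: a simple weighted sum over defendant features. The feature names and weights below are entirely hypothetical, chosen for illustration; they do not reflect any actual court or vendor algorithm. The point is that when socioeconomic variables carry weight, two defendants with identical criminal histories receive different scores.

```python
# Illustrative sketch only: a toy linear risk score, NOT any actual
# court or vendor algorithm. Feature names and weights are hypothetical.

def risk_score(features, weights):
    """Weighted sum of defendant features, as in a simple linear model."""
    return sum(weights[name] * value for name, value in features.items())

WEIGHTS = {
    "prior_arrests": 1.5,     # criminal-history factor
    "age_under_25": 1.0,      # demographic factor
    "unemployed": 0.8,        # socioeconomic factor (a "mark of poverty")
    "unstable_housing": 0.7,  # socioeconomic factor (a "mark of poverty")
}

# Two defendants with identical criminal histories; only the
# socioeconomic features differ, yet the scores diverge.
a = {"prior_arrests": 2, "age_under_25": 1, "unemployed": 0, "unstable_housing": 0}
b = {"prior_arrests": 2, "age_under_25": 1, "unemployed": 1, "unstable_housing": 1}

print(risk_score(a, WEIGHTS))  # 4.0
print(risk_score(b, WEIGHTS))  # 5.5
```

The gap between the two scores is driven entirely by poverty-correlated inputs, which is exactly the pattern Starr objects to when such scores inform sentencing.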

Researcher:

Sonja B. Starr
