



Sonja B. Starr

Starr critical of risk assessment scores for sentencing

a PSC In The News reference

"This Guy Trains Computers to Find Future Criminals" - Bloomberg. 07/18/2016.

Some courts use automatically generated risk scores when sentencing and paroling criminal defendants. Generated by algorithms from factors such as prior arrests, types of crimes, and demographic information, these scores are intended to help judges overcome unconscious bias. But critics like Sonja Starr argue that they can hide prejudice beneath a façade of computerized objectivity. Starr notes that although social science research uses socioeconomic and demographic variables to predict population-level behavior, such models should not be used to calculate a single person's future, especially when the variables involved reflect decades of racial and socioeconomic disparity. In her view, it amounts to a computerized version of racial profiling. Starr says she and others are particularly concerned that "every mark of poverty serves as a risk factor" in these assessments.

Researcher:

Sonja B. Starr
