Sonja B. Starr

Starr critical of risk assessment scores for sentencing

An "In the Media" reference

"This Guy Trains Computers to Find Future Criminals" - Bloomberg. 7/18/2016.

Some courts use automatically generated risk scores in sentencing and parole decisions. Generated by algorithms from factors such as prior arrests, types of crimes, and demographic information, the scores are intended to help judges overcome unconscious bias. But critics like Sonja Starr argue that they can hide prejudice beneath a façade of computerized objectivity. Starr notes that although social science research uses socioeconomic and demographic variables to predict population-level behavior, such models should not be used to forecast a single person's future, especially when those variables are themselves shaped by decades of racial and socioeconomic disparity. The result, she argues, amounts to a computerized version of racial profiling. Starr says she and others are particularly concerned that "every mark of poverty serves as a risk factor" in these assessments.
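To make the concern concrete, below is a minimal, entirely hypothetical sketch in Python of how an actuarial score of this kind might be computed. The feature names, weights, and intercept are invented for illustration and do not describe any real instrument, including the one profiled in the Bloomberg piece.

import math

# Hypothetical weights for an illustrative logistic-regression-style
# risk model. These numbers are made up; they only show the structure
# Starr criticizes: socioeconomic markers (unemployment, unstable
# housing) enter the score the same way criminal history does.
WEIGHTS = {
    "prior_arrests": 0.40,      # count of prior arrests
    "violent_offense": 0.80,    # 1 if current charge is violent
    "age_under_25": 0.50,       # demographic proxy
    "unemployed": 0.35,         # socioeconomic marker
    "unstable_housing": 0.30,   # socioeconomic marker
}
INTERCEPT = -2.0

def risk_score(person: dict) -> float:
    """Return a probability-like risk value in (0, 1)."""
    z = INTERCEPT + sum(WEIGHTS[k] * person.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Two people with identical criminal histories differ in score
# solely because of the poverty-linked variables.
base = {"prior_arrests": 2, "violent_offense": 0, "age_under_25": 0}
poor = {**base, "unemployed": 1, "unstable_housing": 1}

print(f"score without poverty markers: {risk_score(base):.2f}")
print(f"score with poverty markers:    {risk_score(poor):.2f}")

In this toy model, two defendants with identical criminal histories receive different scores purely because of the poverty-linked variables, which is precisely the pattern Starr objects to.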


"In the News" Archive
