Judges let algorithms help them make decisions, except when they don’t

The Verge
Summary

Algorithmic risk assessments are intended to calculate the risk of a criminal defendant not returning to court.

They're supposed to help judges gauge how risky releasing someone from jail would be.
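The article does not describe any specific tool's scoring formula, but as a purely illustrative sketch, a points-based pretrial risk score might be computed like this. The factor names, weights, and caps below are hypothetical, loosely echoing publicly described points-based tools of this kind, and do not reflect any real system discussed in the article.

```python
# Hypothetical sketch of a points-based "failure to appear" risk score.
# Factors, weights, and caps are illustrative only; they are not taken
# from any real risk assessment tool.

def failure_to_appear_score(defendant: dict) -> int:
    """Sum weighted points for factors tied to missing court dates."""
    points = 0
    if defendant.get("pending_charge"):  # charged in another open case
        points += 1
    # Prior failures to appear, capped so a long record can't dominate.
    points += min(defendant.get("prior_ftas", 0), 2) * 2
    if defendant.get("prior_convictions", 0) > 0:
        points += 1
    return points

# Example: one prior failure to appear plus a pending charge.
example = {"pending_charge": True, "prior_ftas": 1, "prior_convictions": 0}
print(failure_to_appear_score(example))  # -> 3, which might be bucketed as "moderate risk"
```

In tools of this general shape, the raw point total is typically mapped to a coarse risk band (low, moderate, high) before a judge ever sees it.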

A study by Northwestern University graduate student Sino Esthappan shows that assumption is flawed at best.

Interviews revealed patterns in judges’ decisions to use risk assessment scores.

Some judges believed the systems underestimated the importance of certain red flags.

Judges also said they used the scores as a matter of efficiency in short hearings.

Judges were keenly aware of how a decision would reflect on them.

Algorithmic tools aim to address a real problem: imperfect human decision-making.

“There’s an issue that can’t necessarily be fixed with risk assessments, but that it goes into a deeper cultural issue within criminal courts,” Esthappan says.

Nutrition label

83% Informative

VR Score: 89
Informative language: 91
Neutral language: 52
Article tone: informal
Language: English
Language complexity: 57
Offensive language: possibly offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: long-living
Affiliate links: no affiliate links