Algorithmic Bail Risk Assessment
This is a news story, published by Verge, about Sino Esthappan's research on how judges use algorithmic risk assessments.
Judges let algorithms help them make decisions, except when they don't
Otherweb rates this article 83% informative.
Algorithmic risk assessments are intended to calculate the risk of a criminal defendant not returning to court.
They're supposed to help judges gauge how risky releasing someone from jail would be.
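To make concrete the kind of computation such a tool performs, here is a minimal, hypothetical sketch in Python of a logistic-style risk score. The function name, features, and weights are all invented for illustration; real pretrial tools use validated, point-based scales rather than these numbers.

```python
import math

# Hypothetical sketch of what a pretrial risk tool computes: combine
# defendant features into a score, then map that score to a probability
# of failure to appear. All feature names and weights here are invented
# for illustration, not taken from any deployed system.

def failure_to_appear_risk(age: int, prior_fta_count: int,
                           has_pending_charge: bool) -> float:
    """Return an illustrative probability that a defendant misses court."""
    # Weighted sum of features (weights are made up for this example).
    z = (-1.5
         - 0.02 * age                     # assumed: risk declines with age
         + 0.60 * prior_fta_count         # prior failures to appear add risk
         + 0.80 * int(has_pending_charge))
    # Logistic link squashes the raw score into the 0-1 range.
    return 1.0 / (1.0 + math.exp(-z))

if __name__ == "__main__":
    # Example: a 30-year-old defendant with one prior failure to appear
    # and a pending charge.
    p = failure_to_appear_risk(age=30, prior_fta_count=1,
                               has_pending_charge=True)
    print(f"Estimated failure-to-appear probability: {p:.2f}")
```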
A study by Northwestern University graduate student Sino Esthappan shows that this is a flawed assumption at best.
His interviews with judges revealed patterns in when they chose to use, or disregard, risk assessment scores.
Some judges believed the systems underestimated the importance of certain red flags.
Judges also said they used the scores as a matter of efficiency in short hearings.
Judges were keenly aware of how a decision would reflect on them.
Algorithmic tools aim to address a real problem: imperfect human decision-making.
“There’s an issue that can’t necessarily be fixed with risk assessments, but that it goes into a deeper cultural issue within criminal courts,” Esthappan says.
VR Score: 89
Informative language: 91
Neutral language: 52
Article tone: informal
Language: English
Language complexity: 57
Offensive language: possibly offensive
Hate speech: not hateful
Attention-grabbing headline: not detected
Known propaganda techniques: not detected
Time-value: long-living
External references: 2
Source diversity: 2
Affiliate links: no affiliate links