The Hidden Discrimination In Criminal Risk-Assessment Scores

KELLY MCEVERS, HOST:

Imagine if you could use an algorithm to predict trouble and maybe avoid it. The idea has gained traction throughout the criminal justice system from police departments to courtrooms. It's called criminal risk assessment.

AUDIE CORNISH, HOST:

The nonprofit news outlet ProPublica looked at one of the most widely used risk-assessment programs and how it fared in Broward County, Fla. There, people who have been arrested are given a questionnaire and then a score from 1 to 10.

MCEVERS: Four or higher suggests they are likely to reoffend. Reporter Julia Angwin says the algorithm's predictions were accurate about 61 percent of the time, and it treated blacks and whites differently. I asked her to describe how the algorithm works.

JULIA ANGWIN: The one that we were looking at is put together by a proprietary software company named Northpointe. They don't disclose their exact formula, but they told us sort of generally what went into it, which is essentially your criminal history, your education level, whether you have a job. And then there's a whole bunch of questions that have to do with your criminal thinking. So, for instance, do you agree or disagree that it's OK for a hungry person to steal? And not every assessment asks that. But the core product that Northpointe offers is 137 questions long. And then jurisdictions can use any subset.

MCEVERS: Other questions, like, you know, were your parents ever sent to jail or prison? How many of your friends are taking drugs illegally? How often do you get in fights at school? Stuff like that, right?

ANGWIN: Yeah. A lot of the questions are about your family, your attitudes. Do you feel bored often? Do you have anger issues? Has your family ever been arrested?

MCEVERS: Does the questionnaire ask specifically about your race?

ANGWIN: No, it does not ask about your race.

MCEVERS: You talked to some hardened criminals who were rated as relatively low risk. Even they were surprised to be rated this way. How did that happen?

ANGWIN: This one guy that we wrote about, Jimmy Rivelli (ph), is a white guy in his 50s who, by his own account, has led a life of crime - mostly petty thefts, mostly to fuel a drug addiction he's struggling to overcome. But when I told him he was rated low risk, his reaction was, that's a very big surprise to me because I had just gotten out of five years in state prison for drug trafficking when I was arrested.

MCEVERS: Wow. And so how is it someone like that can be given such a low rating?

ANGWIN: So we analyzed more than 7,000 scores to see what was causing these types of disparities. And what we found was that although this algorithm is actually OK at predicting generally whether you're going to commit another crime within the next two years, it fails differently for blacks and whites. Black defendants are twice as likely to be incorrectly rated high risk, meaning they did not go on to reoffend. And white defendants are twice as likely to be incorrectly rated low risk and yet go on to reoffend.
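The disparity Angwin describes is a gap in error rates between groups: the false positive rate (rated high risk but did not reoffend) and the false negative rate (rated low risk but did reoffend). A minimal sketch of that comparison in Python follows; the Defendant fields, the sample records, and the 4-or-higher cutoff are illustrative assumptions drawn from the interview, not Northpointe's formula or ProPublica's actual analysis code.

```python
# Sketch: compare error rates across groups, given per-defendant records of
# risk score, group, and whether the person actually reoffended within two
# years. All field names and sample records are hypothetical.

from dataclasses import dataclass

@dataclass
class Defendant:
    score: int        # risk score, 1 to 10
    group: str        # e.g. "black" or "white"
    reoffended: bool  # committed another crime within two years?

HIGH_RISK_CUTOFF = 4  # per the transcript, 4 or higher means likely to reoffend

def error_rates(defendants, group):
    members = [d for d in defendants if d.group == group]
    # False positive: labeled high risk, but did not reoffend.
    non_reoffenders = [d for d in members if not d.reoffended]
    fp = sum(1 for d in non_reoffenders if d.score >= HIGH_RISK_CUTOFF)
    # False negative: labeled low risk, but did reoffend.
    reoffenders = [d for d in members if d.reoffended]
    fn = sum(1 for d in reoffenders if d.score < HIGH_RISK_CUTOFF)
    return fp / len(non_reoffenders), fn / len(reoffenders)

# Hypothetical records, for illustration only:
data = [
    Defendant(8, "black", False), Defendant(2, "black", True),
    Defendant(6, "white", True),  Defendant(3, "white", True),
    Defendant(5, "black", False), Defendant(1, "white", False),
]

for g in ("black", "white"):
    fpr, fnr = error_rates(data, g)
    print(f"{g}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

The point of splitting the rates by group is that an algorithm can have the same overall accuracy for everyone while its mistakes fall unevenly, which is the pattern ProPublica reported.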

MCEVERS: What does it mean for someone who is incorrectly tagged by this risk-assessment system?

ANGWIN: So basically, if you're given a low-risk score, this is given to the judge. And in Florida, where we were looking at the scores, the judge looks at that while making a decision about pretrial release, meaning can you get out of jail on bond while awaiting trial for your crime? In other jurisdictions where the same exact software is used, this score is used for sentencing.

So when you've been convicted of a crime, the judge gets a secret report - a presentencing investigation, usually sealed to the public - which says, this person is high risk; you should take that into consideration when sentencing. And those decisions are very important because if you're a judge looking at this high-risk label, you might be inclined - and this has happened - to give that person a longer prison sentence.

MCEVERS: We should say that the company, Northpointe, that administers these tests disputes your findings.

ANGWIN: Yes, that is correct.

MCEVERS: Based on all the evidence that you looked at and all the experts that you talked to, do you think there's a place for these types of algorithms in the criminal justice system?

ANGWIN: The movement towards risk-assessment scores is very well-intentioned, and I'm sympathetic with the idea that we need to make the criminal justice system more objective and more fair. And it hasn't been that in the past. But having looked at the outcomes from this particular scoring system, I'm not sure that it's actually moving us that many steps toward a fairer system.

When you have a poor outcome that is potentially very damaging to somebody - being called high risk when they're not - and that is happening twice as often to black defendants as to white defendants, I am not sure that's leading to a fairer system.

MCEVERS: That's Julia Angwin of ProPublica. Thanks so much.

ANGWIN: Thank you.

Transcript provided by NPR, Copyright NPR.
