KTEP - El Paso, Texas

Can Software That Predicts Crime Pass Constitutional Muster?

Jul 26, 2013
Originally published on July 29, 2013 10:01 am

Typically, police arrive at the scene of a crime after it occurs. But rather than send cops to yesterday's crime, a new trend in law enforcement is using computers to predict where tomorrow's crimes will be — and then try to head them off.

The software uses past statistics to project where crime is moving. Police in Los Angeles say it's worked well in predicting property crimes there. Now Seattle is about to expand its use to predicting gun violence.

It all started as a research project. Jeff Brantingham, an anthropologist at UCLA, wanted to see if computers could model future crime the same way they model earthquake aftershocks. Turns out they can.

"It predicts sort of twice as much crime as any other existing system, even going head-to-head with a crime analyst," Brantingham says.

Checking The Boxes

Older systems, like the famous CompStat in New York, show where crime has been. This system looks forward.

"The model will actually predict other locations, that effectively say, even though there was a crime somewhere else in your environment, the risk is still greatest in this location today for the next 10 hours or the next 12 hours," Brantingham explains.

Brantingham and his colleagues are now selling the predictive system to police departments under the name PredPol. At this point, you may be thinking about the sci-fi movie Minority Report. But this is different. No psychics sleeping in bathtubs, for one. More to the point, this doesn't predict who will commit a future crime, just where it is likely to happen.

In Seattle, police Sgt. Christi Robbin zooms in on a map of the city. Earlier this year, Seattle started using PredPol to predict property crimes. It's now the first place to try predicting gun violence with the software.

"These red boxes [on the map] are predictions of where the next crimes are likely to occur," Robbin explains.

At the start of every shift, patrol cops are assigned to those red boxes. "So we're asking that they spend the time in that 500-by-500-square-foot box, doing whatever proactive work they can to prevent that crime," Robbin says.

On a recent shift, Officer Philip Monzon pulls up inside his box; today, it's a city block near the Seattle waterfront.

"[The police] want visibility, they want contacts with businesses as are appropriate, and anyone who's wandering through the area," Monzon explains.

This area has parking lots, and PredPol's forecast includes car thefts. As Monzon passes a green Honda, he pauses. The guy inside seems to be ducking under the dashboard.

"[I] wanna make sure to see if he's got the key or if he's gonna pull out anytime soon," Monzon says.

The car starts — the guy probably does have the key. But why didn't Monzon challenge him, just in case?

"I don't really have enough — I'm not just going to single out one guy in a Honda," he explains.

Computer Models And 'Reasonable Suspicion'

And this is where it gets tricky. The courts say police need "reasonable suspicion" in order to stop somebody. That suspicion can come from a lot of things — even someone's "furtive movements," as police like to say.

But can it come from the fact that someone is occupying an imaginary red box drawn by a computer?

"Ah — no. No. I don't know. I wouldn't make a stop solely on that," Monzon says.

That's probably the right answer, says Andrew Guthrie Ferguson, a law professor at the University of the District of Columbia who has taken a special interest in the constitutional implications of PredPol. He says the departments using it have told police not to use it as a basis for stops. But he also wonders how long that can last.

"The idea that you wouldn't use something that is actually part of the officer's suspicion and not put that in — [that] may come to a head when that officer is testifying," Ferguson says. Either that officer will have to omit the fact that he or she was prompted by PredPol, he says, or that officer will admit it on the stand. "Then the issue will be raised for the court to address."

And it may be that PredPol is a constitutional basis for stopping someone. Some might consider it more objective than an individual police officer's judgment — less prone to racism or other kinds of profiling, for example.

Ferguson says that argument may have merit, but that police and society still need to be careful.

"I think most people are gonna defer to the black box," he says. "Which means we need to focus on what's going into that black box, how accurate it is, and what transparency and accountability measures we have [for] it."

In other words, even though computers aren't biased, the statistics feeding them might be. And if police are going to follow an algorithm, we should at least be willing to check the math.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

ROBERT SIEGEL, HOST:

Some police departments are trying to predict the future. They're using software that lays out statistics from the past to project where crime is moving. Police in Los Angeles say it has worked well in predicting property crimes. Now, Seattle is about to expand it to gun violence. But as we hear from NPR's Martin Kaste, police are hesitant about relying too much on computers.

MARTIN KASTE, BYLINE: This all started as a research project. An anthropologist at UCLA, Jeff Brantingham, wanted to see if computers could model future crime, the same way they model earthquake aftershocks. Turns out they can.

DR. JEFFREY BRANTINGHAM: It predicts sort of twice as much crime as any other existing system, even going head-to-head with a crime analyst.

KASTE: Older systems, like the famous CompStat in New York, show where crime has been. This looks forward.

BRANTINGHAM: The model will actually predict other locations that effectively say, even though there was a crime somewhere else in your environment, the risk is still greatest in this location today in the next 10 hours or the next 12 hours.

KASTE: Brantingham and his colleagues are now selling this predictive system to police departments. They call their product PredPol. And at this point, you're probably already thinking about the sci-fi movie "Minority Report." But this is different. There are no psychics sleeping in bathtubs, for one. But more to the point, this isn't about predicting the who of future crime, just the where.

SERGEANT CHRISTI ROBBIN: These red boxes are predictions of where the next crimes are likely to occur.

KASTE: Police Sergeant Christi Robbin pinches and zooms on a map of Seattle. Earlier this year, the city started using PredPol to predict property crimes. Now, it's expanding the system to predict gun violence too; the first place to do so. At the start of every shift, patrol cops are assigned to the boxes on the map.

ROBBIN: So we're asking that they spend the time in that 500-by-500-square-foot box, doing whatever proactive work they can to prevent that crime.

PHILIP MONZON: ...next block down and just - so this is the spot to park. We're going to park here.

KASTE: Officer Philip Monzon has just pulled up inside his box. Today, it's a city block near the Seattle waterfront.

MONZON: They want visibility. They want contacts with businesses as are appropriate, and anybody who's wandering through the area. So here we go.

KASTE: This area has a lot of parking lots, and PredPol's forecast includes car thefts. As Monzon passes a green Honda, he pauses. The guy inside seems to be ducking under the dashboard.

MONZON: I just want to make sure if he has a key or if he's going to pull out anytime soon.

KASTE: The car starts, the guy probably does have the key. But why didn't Officer Monzon challenge him, just in case?

MONZON: I don't really have enough - I'm not going to just single out one guy in a Honda.

KASTE: And here's where this gets tricky. The courts say police need reasonable suspicion in order to stop somebody. That suspicion can come from a lot of things, even someone's furtive movements, as the police like to say. But can it come from the fact that someone is occupying an imaginary red box drawn by a computer?

MONZON: No, no. I don't know. I wouldn't make a stop solely on that.

KASTE: That's probably the right answer, says Andrew Guthrie Ferguson. He's a law professor at the University of the District of Columbia, and he's taken a special interest in the constitutional implications of PredPol. He says the departments using it have told police not to use it as a basis for stops. But he wonders how long that can last.

ANDREW GUTHRIE FERGUSON: The idea that you wouldn't use something that is actually part of the officer's suspicion and not put that in may come to a head when that officer is testifying and either is going to have to omit a fact that really was the reason he stopped or she stopped the suspect or is something that they will then admit on the stand and then the issue will be raised for the court to address.

KASTE: And it may be that PredPol is a constitutional basis for stopping someone. Some might see it as more objective than a cop's judgment, less prone to racism or other kinds of profiling. Ferguson says that is possible, but we need to be careful.

FERGUSON: I think most people are going to defer to the black box, which means we need to focus on what's going into that black box, how accurate it is, and what transparency and accountability measures we add to it.

KASTE: In other words, even though computers aren't biased, the stats feeding them might be. And he says if we're going to follow an algorithm, we should at least be willing to check the math. Martin Kaste, NPR News, Seattle.

(SOUNDBITE OF MUSIC)

SIEGEL: This is ALL THINGS CONSIDERED from NPR News. Transcript provided by NPR, Copyright NPR.