
Courts use a ‘Minority Report’ crime prediction algorithm, and it’s incredibly racist

Published May 23rd, 2016 4:55PM EDT
Court Risk Assessment Algorithm


I was somewhat surprised to learn that courts use software to predict the likelihood of criminals reoffending. But I was far less surprised to learn that the computer, much like the system it serves, seems to hate black people.

ProPublica has a new report that shines a light on the system used by Broward County, Florida. Those courts use a system made by Northpointe, a for-profit company. Various factors are fed into an algorithm, which spits out a score reflecting an offender’s chance of reoffending within two years.


Those scores are then used by judges to help with everything from bond amounts to sentencing. It’s kind of like a credit score, only worse-informed, and used to make decisions about people’s liberty, rather than car insurance. It’s currently used in states including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin.

To measure the effectiveness of Northpointe’s algorithm in the real world, ProPublica obtained the risk scores of 7,000 people arrested in Broward County, and tracked them for the next two years.

Surprise result! The computer was “remarkably unreliable” in predicting violent crimes: only 20 percent of people predicted to commit violent crimes actually did so. That figure rises to just 61 percent when all crimes are considered.

What makes the report worse (and yes, there is something worse than computers using flawed methodology to lock people up) is the racial bias. ProPublica found that the algorithm falsely flagged black defendants as future criminals at twice the rate it did white defendants.

On the flip side, white defendants were mistakenly labelled as “low risk” more often than black defendants.
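
To make those two error types concrete, here is a minimal sketch (not ProPublica’s actual code, and using entirely made-up toy data) of the kind of comparison described above: take each person’s risk label, whether they actually reoffended within two years, and their race, then compute how often “high risk” predictions panned out and how often people who never reoffended were wrongly flagged, broken out by group. The `Record` structure and function names are illustrative assumptions, not anything from Northpointe or ProPublica.

```python
from dataclasses import dataclass

@dataclass
class Record:
    predicted_high_risk: bool   # algorithm flagged the person as likely to reoffend
    reoffended: bool            # actually reoffended within the two-year window
    race: str                   # e.g. "black" or "white"

def precision(records):
    """Share of people flagged high risk who actually reoffended."""
    flagged = [r for r in records if r.predicted_high_risk]
    if not flagged:
        return 0.0
    return sum(r.reoffended for r in flagged) / len(flagged)

def false_positive_rate(records, race):
    """Among people of a given race who did NOT reoffend, share wrongly flagged high risk."""
    did_not_reoffend = [r for r in records if r.race == race and not r.reoffended]
    if not did_not_reoffend:
        return 0.0
    return sum(r.predicted_high_risk for r in did_not_reoffend) / len(did_not_reoffend)

# Toy data purely for illustration; the real analysis covered roughly 7,000 Broward County cases.
sample = [
    Record(True, False, "black"),
    Record(True, True, "black"),
    Record(False, False, "white"),
    Record(True, False, "white"),
    Record(False, True, "white"),
]

print(f"Overall precision: {precision(sample):.2f}")
for race in ("black", "white"):
    print(f"False positive rate ({race}): {false_positive_rate(sample, race):.2f}")
```

A gap between the two groups’ false positive rates is exactly the disparity ProPublica reported: black defendants who never went on to reoffend were flagged as high risk far more often than white defendants in the same position.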

Northpointe disputed ProPublica’s findings, and wouldn’t release the exact algorithm it uses to compute risk scores. So, in conclusion, a computer is incorrectly classifying individuals as high or low risk, using a formula that its maker won’t disclose, but that is objectively racist. And courts are still using the algorithm to influence judges’ decisions. Right.

Chris Mills, News Editor

Chris Mills has been a news editor and writer for over 15 years, starting at Future Publishing before moving to Gawker Media and then BGR. He studied at McGill University in Quebec, Canada.