
Courts use a ‘Minority Report’ crime prediction algorithm, and it’s incredibly racist

May 23rd, 2016 at 4:55 PM

I was somewhat surprised to learn that courts use software to predict the likelihood of criminals reoffending. But I was far less surprised to learn that the computer, much like the system it serves, seems to hate black people.

ProPublica has a new report that shines a light on the system used by Broward County, Florida. Those courts use a system made by Northpointe, a for-profit company. Various factors are fed into an algorithm, which spits out a score that reflects an offender’s chance of reoffending within two years.

Those scores are then used by judges to help with everything from bond amounts to sentencing. It’s kind of like a credit score, only worse-informed, and used to make decisions about people’s liberty, rather than car insurance. It’s currently used in states including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington, and Wisconsin.

To measure the effectiveness of Northpointe’s algorithm in the real world, ProPublica obtained the risk scores of 7,000 people arrested in Broward County, and tracked them for the next two years.

Surprise result! The computer was “remarkably unreliable” in predicting violent crimes: only 20 percent of people predicted to commit violent crimes actually did so. That figure only rises to 61 percent when considering all crimes.

What makes the report even worse (and yes, there is something worse than computers using flawed methodology to lock people up) is the racial bias. ProPublica found that the algorithm falsely flagged black defendants as likely reoffenders at twice the rate that it did white defendants.

On the flip side, white defendants were mistakenly labelled as “low risk” more often than black defendants.
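That asymmetry is worth unpacking: an algorithm can look similarly "accurate" overall for two groups while distributing its mistakes very differently between them. Here is a toy calculation with hypothetical confusion counts (these numbers are illustrative assumptions, not ProPublica's actual data) showing how one group can bear most of the false "high risk" flags while the other absorbs most of the false "low risk" labels:

```python
# Hypothetical confusion counts per group -- NOT ProPublica's dataset.
# "high" = algorithm predicted reoffense, "low" = predicted no reoffense;
# "reoffend"/"no_reoffend" = what actually happened within two years.
groups = {
    "A": {"high_reoffend": 30, "high_no_reoffend": 45,
          "low_reoffend": 20,  "low_no_reoffend": 105},
    "B": {"high_reoffend": 30, "high_no_reoffend": 15,
          "low_reoffend": 50,  "low_no_reoffend": 105},
}

def error_rates(c):
    # False positive rate: share of people who did NOT reoffend
    # but were still flagged "high risk".
    fpr = c["high_no_reoffend"] / (c["high_no_reoffend"] + c["low_no_reoffend"])
    # False negative rate: share of people who DID reoffend
    # but were labelled "low risk".
    fnr = c["low_reoffend"] / (c["low_reoffend"] + c["high_reoffend"])
    return fpr, fnr

for name, counts in groups.items():
    fpr, fnr = error_rates(counts)
    print(f"Group {name}: false positive rate {fpr:.1%}, false negative rate {fnr:.1%}")
```

With these made-up numbers, group A is wrongly flagged "high risk" at 30% versus 12.5% for group B, while group B's actual reoffenders are mislabelled "low risk" far more often. That is the shape of the disparity ProPublica describes, even though both groups are scored by the same formula.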

Northpointe disputed ProPublica’s findings, and wouldn’t release the exact algorithm it uses to compute risk scores. So, in conclusion: a computer is incorrectly classifying individuals as high or low risk, using a formula that it won’t disclose, with a measurable racial bias. And courts are still using the algorithm to influence judges’ decisions. Right.
