Report: Software Used to Predict Future Crime Shows Racial Bias

Risk-assessment scores, which are becoming ever more common in courtrooms across the nation, are often used to make important decisions regarding a defendant’s freedom or the terms of that freedom.

However, according to ProPublica, the scores, which are based on an algorithm that its creators claim can predict a defendant’s likelihood of committing another crime, appear to show a clear racial bias against black defendants. The scores often wrongly flagged black defendants as future criminals, while white defendants were mislabeled as low risk more often than their black counterparts.

According to the report, then-U.S. Attorney General Eric Holder expressed caution about the risk scores in 2014 because of such possible bias.

“Although these measures were crafted with the best of intentions, I am concerned that they inadvertently undermine our efforts to ensure individualized and equal justice,” he said. “They may exacerbate unwarranted and unjust disparities that are already far too common in our criminal-justice system and in our society.”

At the time, he called on the U.S. Sentencing Commission to study the use of these scores, which, according to ProPublica, the commission never did. So the nonprofit investigative-journalism organization took it upon itself to study risk scores assigned to more than 7,000 people arrested in Broward County, Fla., in 2013 and 2014, checking how many were charged with new crimes over the next two years. This is the same benchmark used by the creators of the algorithm.
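To make that benchmark concrete, here is a minimal sketch of how a two-year recidivism flag could be computed. The column names are hypothetical; ProPublica published its actual analysis code and data alongside its story.

```python
import pandas as pd

# Hypothetical columns: the date a defendant was scored and the date of any
# subsequent charge (NaT if none). A defendant counts as having reoffended
# if a new charge falls within two years of the screening.
df = pd.DataFrame({
    "screening_date":  pd.to_datetime(["2013-03-01", "2014-06-15"]),
    "new_charge_date": pd.to_datetime(["2014-09-01", None]),
})
df["reoffended"] = (
    df["new_charge_date"] - df["screening_date"]
).le(pd.Timedelta(days=2 * 365))  # NaT comparisons evaluate to False
```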

Among its findings, ProPublica noted that only 23.5 percent of white defendants were labeled higher risk yet did not commit any further crime, while 44.9 percent of African Americans were labeled higher risk yet did not reoffend. On the other side of the coin, some 47.7 percent of white offenders were labeled lower risk but did commit another crime, whereas only 28 percent of African Americans labeled lower risk went on to commit another offense.
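In accuracy-metric terms, those figures are false positive and false negative rates computed separately by race. A minimal sketch, assuming hypothetical race, high_risk and reoffended columns standing in for the Broward County records:

```python
import pandas as pd

# Toy rows only, so the arithmetic is visible; high_risk means the score fell
# above the "higher risk" cutoff, reoffended means a new charge within two years.
df = pd.DataFrame({
    "race":       ["black", "white", "black", "white", "black", "white"],
    "high_risk":  [True, False, True, True, False, False],
    "reoffended": [True, True, False, False, True, False],
})

for race, group in df.groupby("race"):
    did_not = ~group["reoffended"]
    # False positive rate: labeled higher risk, among those who did not reoffend.
    fpr = (group["high_risk"] & did_not).sum() / did_not.sum()
    # False negative rate: labeled lower risk, among those who did reoffend.
    fnr = (~group["high_risk"] & group["reoffended"]).sum() / group["reoffended"].sum()
    print(f"{race}: FPR {fpr:.1%}, FNR {fnr:.1%}")
```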

Overall, ProPublica found the score highly unreliable in forecasting violent crime: only 20 percent of the people predicted to commit such crimes actually went on to do so.

When all crimes, including simple misdemeanors such as driving with an expired license, were taken into consideration, the algorithm’s accuracy increased to “somewhat more accurate than a coin flip,” ProPublica states, with 61 percent of those labeled as likely to reoffend doing so within two years.
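That 61 percent figure is a positive predictive value: of everyone labeled likely to reoffend, the share who actually did. Continuing the hypothetical columns from the sketch above:

```python
# Share of high-risk labels that panned out within two years; a coin flip
# would sit at 50 percent.
labeled_likely = df[df["high_risk"]]
ppv = labeled_likely["reoffended"].mean()
print(f"Predictive value of a high-risk label: {ppv:.1%}")
```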

When ProPublica controlled for defendants’ prior crimes, the types of crimes they were arrested for, and each defendant’s age and gender, the racial disparity persisted: black defendants were 77 percent more likely to be tagged as at higher risk of committing a future violent crime, and 45 percent more likely to be labeled at higher risk of committing a future crime of any kind.
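The “controlling for” step is, in spirit, a regression: the high-risk label is modeled as a function of race plus the other attributes, so the race coefficient reflects the disparity with priors, age and gender held fixed. A hedged sketch with simulated stand-in data (ProPublica used the real Broward County records; every column name here is hypothetical):

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated rows so the model runs end to end; nothing here reflects the
# actual Broward County distributions.
rng = np.random.default_rng(0)
n = 1000
data = pd.DataFrame({
    "high_risk":    rng.integers(0, 2, n),
    "is_black":     rng.integers(0, 2, n),   # 1 = black defendant
    "priors_count": rng.integers(0, 10, n),
    "age":          rng.integers(18, 70, n),
    "is_male":      rng.integers(0, 2, n),
})

model = smf.logit(
    "high_risk ~ is_black + priors_count + age + is_male", data=data
).fit(disp=False)

# exp(coefficient) - 1 is the change in the odds of a high-risk label for
# black defendants with the other variables held fixed; ProPublica reported
# figures on the order of +77 percent for violent-crime risk.
odds_change = np.exp(model.params["is_black"]) - 1
print(f"Change in odds for black defendants: {odds_change:+.0%}")
```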

Read the full report at ProPublica.
