Study: Algorithm That's "Biased Against Blacks" Behind Poor Inmate Recidivism Predictions

Big data doesn't wear the blindfold of justice.

Algorithms can be just as biased as the people who make them. An investigation by ProPublica found that an algorithm built by Northpointe to predict an incarcerated person’s likelihood of committing another crime rates black people as far more likely to reoffend after release than white people.

The algorithm assigns each inmate a “risk assessment” score, which is used across the country to guide prison release dates. After ProPublica analyzed data for more than 7,000 people in Broward County, Florida, the reality of racist algorithms was clear: black defendants were almost twice as likely as white defendants to be flagged as likely to commit another crime.

The analysis controlled for variables like gender and criminal background. Even so, black defendants were predicted to be 77 percent more likely to commit a violent crime in the future and 45 percent more likely to commit any future crime. Just as troubling, only 20 percent of the people Northpointe predicted would commit future violent crimes actually went on to do so.
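To see what “controlling for variables” means in practice, here is a minimal sketch of the kind of logistic regression ProPublica describes. This is not ProPublica’s actual code: the data is synthetic and the column names are hypothetical. The idea is that the model estimates how much a high-risk label depends on race once gender and prior record are held fixed.

```python
# A minimal sketch (not ProPublica's actual code) of using logistic
# regression to isolate the effect of race on receiving a high-risk
# score while controlling for gender and prior record. All data here
# is synthetic; the variables are hypothetical stand-ins.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 7000  # roughly the size of the Broward County sample

# Synthetic covariates: race (1 = black defendant), sex (1 = male),
# and a count of prior offenses.
race = rng.integers(0, 2, n)
male = rng.integers(0, 2, n)
priors = rng.poisson(2, n)

# Synthetic outcome: whether the tool labeled the defendant high risk.
# The race term here is invented purely to give the model a signal to fit.
logit = -1.0 + 0.6 * race + 0.2 * male + 0.3 * priors
high_risk = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)

X = sm.add_constant(np.column_stack([race, male, priors]))
model = sm.Logit(high_risk, X).fit(disp=0)

# exp(coefficient) is an odds ratio: a value of 1.77 would correspond
# to "77 percent higher odds" of a high-risk label for black defendants,
# holding gender and prior record fixed.
print(np.exp(model.params[1]))
```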

Northpointe's data doesn't judge equally.

Northpointe uses 137 questions to judge how likely a person is to commit another crime. As ProPublica reports, none of the questions asks about race outright, but some of them may correlate with it.

Examples of the questions include: “Was one of your parents ever sent to jail or prison?” and “How often did you get in fights while at school?” Respondents are also asked to agree or disagree with statements such as “A hungry person has a right to steal.” Other factors include education level and whether or not a person has a job.
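For illustration only, here is a hypothetical sketch of how answers to a questionnaire like this could be rolled up into a single risk score. The questions are the ones quoted above, but the weights and scoring scale are invented; they are not Northpointe’s actual model.

```python
# A hypothetical sketch of turning questionnaire answers into a risk
# score. The weights below are invented for illustration and are not
# Northpointe's actual scoring.
QUESTIONS = {
    "Was one of your parents ever sent to jail or prison?": 2,
    "How often did you get in fights while at school?": 3,
    "A hungry person has a right to steal.": 1,
}

def risk_score(answers: dict) -> int:
    """Sum each answer (0 = no/never, higher = yes/agree/often) times its weight."""
    return sum(QUESTIONS[q] * a for q, a in answers.items())

# Example: a defendant whose parent was incarcerated and who agrees
# with the statement about stealing gets a higher score.
print(risk_score({
    "Was one of your parents ever sent to jail or prison?": 1,
    "How often did you get in fights while at school?": 0,
    "A hungry person has a right to steal.": 1,
}))  # -> 3
```

Notice that nothing in such a questionnaire names race, yet an answer like parental incarceration can correlate with it, which is exactly the concern ProPublica raises.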

ProPublica detailed exactly how it accessed and analyzed Northpointe’s data in an explanatory post. In response, Northpointe sent ProPublica a letter stating that the company “does not agree that the results of [their] analysis, or the claims being made based upon that analysis, are correct or that they accurately reflect the outcomes from the application of the model.”

ProPublica’s results, if not entirely surprising, are concerning. Risk assessments are used to make parole decisions and to set bond amounts; in nine states, they even factor into criminal sentencing.

Technology is not a cure-all for every problem, especially when the technology is just as flawed as the people who make it.
