
An Algorithm Is Being Tested in San Francisco to Set Pre-Trial Bail

It's a battle between tradition and technology.


The criminal justice system is in the midst of an algorithmic change. Around 30 jurisdictions, including the entire states of Arizona, Kentucky, and New Jersey as well as cities like San Francisco and Chicago, have been testing an algorithm that sets the cost of bail based on nationwide criminal records data. Not all judges, however, are ready to take the digital recommendations to heart.

The algorithm, called the Public Safety Assessment, or PSA, was created by the Houston-based Laura and John Arnold Foundation. Its goal is to take bias out of setting bail by drawing on data from 1.5 million pretrial cases. In the past, however, algorithms have carried the same biases as the people who made them.

Nine factors go into the algorithm, per the foundation’s website:

  • Whether the current offense is violent
  • Whether the person has a pending charge at the time of arrest
  • Whether the person has a prior misdemeanor conviction
  • Whether the person has a prior felony conviction
  • Whether the person has a prior conviction for a violent crime
  • The person’s age at the time of arrest
  • Whether the person failed to appear at a pretrial hearing in the last two years
  • Whether the person failed to appear at a pretrial hearing more than two years ago
  • Whether the person has previously been sentenced to incarceration

The algorithm doesn’t take into account race, gender, income, education, employment, or neighborhood. This, according to the foundation, makes the PSA neutral.
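The foundation doesn’t spell out how these nine factors are weighted, but the shape of the calculation is easy to sketch. Below is a minimal, hypothetical Python version: the PretrialRecord fields mirror the list above, while the equal weights, the under-23 age cutoff, and the 0-to-9 scale are all invented for illustration, not taken from the real PSA.

    from dataclasses import dataclass

    @dataclass
    class PretrialRecord:
        """The nine inputs the foundation lists; nothing else goes in."""
        current_offense_violent: bool
        pending_charge_at_arrest: bool
        prior_misdemeanor: bool
        prior_felony: bool
        prior_violent_conviction: bool
        age_at_arrest: int
        failed_to_appear_past_two_years: bool
        failed_to_appear_over_two_years_ago: bool
        prior_incarceration: bool

    def risk_score(record: PretrialRecord) -> int:
        """Equal-weight tally of the risk factors (weights are invented)."""
        return sum([
            record.current_offense_violent,
            record.pending_charge_at_arrest,
            record.prior_misdemeanor,
            record.prior_felony,
            record.prior_violent_conviction,
            record.age_at_arrest < 23,  # assumed cutoff; the real PSA's age handling isn't given here
            record.failed_to_appear_past_two_years,
            record.failed_to_appear_over_two_years_ago,
            record.prior_incarceration,
        ])

    # Example: one prior misdemeanor and one recent failure to appear.
    defendant = PretrialRecord(
        current_offense_violent=False,
        pending_charge_at_arrest=False,
        prior_misdemeanor=True,
        prior_felony=False,
        prior_violent_conviction=False,
        age_at_arrest=34,
        failed_to_appear_past_two_years=True,
        failed_to_appear_over_two_years_ago=False,
        prior_incarceration=False,
    )
    print(risk_score(defendant))  # 2 on this invented 0-to-9 scale

What matters in the sketch is as much what’s absent as what’s present: race, gender, income, and the other excluded attributes never appear as inputs, which is the entire basis of the foundation’s neutrality claim.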

Still, judges in San Francisco haven’t been consistently following the recommendations, the San Francisco Chronicle reports.

People who can't afford bail can wait for months before receiving a trial. 


San Francisco moved to use the algorithm after the city was sued by a national civil rights group that claimed exorbitant bail hurt the poor more than the rich. Rich people who committed minor crimes were buying their way out of jail, while poor people who couldn’t afford excessive bail amounts were left in a holding cell until a trial could be scheduled.

The PSA was supposed to level the playing field by looking at data rather than just the immediate crime. The algorithm uses historical data to judge how likely a person is to commit another crime or fail to appear for trial if released on bail, Minority Report style. If that likelihood is high, bail is set higher, and vice versa.
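The article doesn’t say how a likelihood estimate becomes a dollar figure. Purely as an illustration of the stated idea that higher risk means higher bail, a monotone mapping on top of the risk_score sketch above might look like the following; the linear scaling and the $5,000 base amount are both invented:

    def recommended_bail(score: int, base_amount: int = 5_000) -> int:
        """Scale bail with the risk tally (mapping and dollar figures invented)."""
        return base_amount * (1 + score)

    print(recommended_bail(0))  # $5,000 for the lowest-risk score
    print(recommended_bail(9))  # $50,000 for the highest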

A similar algorithm, created by Northpointe, was used to guide prisoner release dates. In May, ProPublica published an investigation finding that the “risk assessment” scores from Northpointe’s algorithm disproportionately flagged black people as more likely than white people to commit another crime after release. Of those the algorithm predicted would go on to commit violent crimes, just 20 percent actually did.


The foundation’s algorithm aims to avoid similar bias by removing demographic indicators entirely. Whether it truly works, however, won’t be known until judges actually start to rely on algorithms over precedent and intuition.
