Why San Francisco courtrooms are turning to computer algorithms for advice

Jan 26, 2017

 

If you can’t afford bail in this country, you get stuck in jail until your trial. Many have said it’s a system that's biased against the poor.

 

In fact, an organization called Equal Justice Under Law recently filed a lawsuit arguing that San Francisco’s bail system is unconstitutional. Many city officials agree. Last fall the city attorney decided not to defend against the lawsuit. Instead, the city is exploring unconventional ideas for modernizing its bail system, like turning to a computer program for advice on whom to release and whom to keep behind bars. But can computers be better judges than judges?

 

Inside Judge Moody’s courtroom

Judge Ross Moody takes his seat at the front of a San Francisco courtroom for an afternoon of bail hearings. First up is a man charged with carrying a loaded gun in public. His defense attorney makes the case for releasing him before trial. He says the defendant has a job and a good track record of showing up in court.

The prosecutor disagrees, arguing that the defendant only showed up in court because he was already in custody, and that he’s clearly a threat to the public.

The judge has three options.

He can detain the defendant until trial, release him until then, or require him to pay a big chunk of cash for his freedom.

That’s called setting bail. And in this case, the judge sets bail at $150,000.

Two systems

And that, in a nutshell, is California’s bail system at work. If the defendant can afford bail, he gets out. If not, he’s stuck.

The system is intended to make sure suspects show up in court, and to protect the public from potentially dangerous criminals. But critics like District Attorney George Gascón say that using money as a tool for achieving these goals doesn’t make sense.

“It really creates two systems: it creates a system for the people that can afford to post bail, and a system for those that cannot,” Gascón says.

Turning to data

So Gascón went searching for a way to decide which suspects should be released that didn’t involve money. He turned to data.

Last summer, San Francisco started experimenting with a computer program, or algorithm, to help judges decide who can be released without bail. The algorithm bases its recommendations on the case histories of more than 1.5 million people, taking into account the person’s age, criminal history, and record of showing up in court.

“You have a system that has developed a body of knowledge that is way superior to what a human being will ever be able to do,” Gascón says.

For example, if the suspect has only ever been convicted of shoplifting and has always shown up in court, the algorithm is likely to recommend release.

If the suspect has a record of violent assault against police officers or has always missed his court dates, the algorithm is more likely to recommend the suspect stay in custody.
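The kind of weighing described above can be sketched as a toy scoring function. This is purely illustrative: the article doesn't detail the actual factors, weights, or cutoffs San Francisco's tool uses, so the feature names, weights, and threshold below are invented assumptions, not the real assessment.

```python
# Toy sketch of a pretrial risk recommendation. The real tool's factors
# and weights are not described in the article; everything here is a
# made-up illustration of "score the history, then compare to a cutoff."

def recommend_release(age, prior_violent_convictions, missed_court_dates):
    """Return True if the (hypothetical) risk score favors release."""
    score = 0
    if age < 23:                            # younger defendants score as higher risk
        score += 1
    score += 2 * prior_violent_convictions  # violent history weighs heavily
    score += missed_court_dates             # failures to appear add risk
    return score <= 2                       # low total score -> recommend release

# A defendant with only a shoplifting record who always appeared in court:
print(recommend_release(age=30, prior_violent_convictions=0, missed_court_dates=0))  # True
```

The point of the sketch is only the shape of the decision: a fixed formula over a defendant's history, with no room for the courtroom nuance a judge sees.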

The judge then takes these recommendations into account when making his bail decision.

Questions of bias

One thing the algorithm doesn’t consider: race. And that’s why San Francisco Deputy Public Defender Chesa Boudin supports this experiment. He says that even experienced judges may still have their own implicit biases.

“They may see a person who looks a particular way, have their hair done a particular way, who behaves a particular way in court, and that might shape their decision about that person's liberty,” Boudin says.

But Boudin worries that the algorithm may also be biased. For example, San Francisco’s algorithm, designed by a nonprofit called the Laura and John Arnold Foundation, weighs a defendant's criminal history.

Boudin explains, “Someone who has been stopped by the police, numerous times, because of the color of their skin … in other words, the exact people the Arnold Foundation seeks to protect with this new tool are prejudiced against based on the built-in legacy of prejudice and racism in our criminal justice system.”

The judge calls the shots

However, we’re not turning bail decisions over to computers just yet. After crunching all the data, the algorithm spits out a recommendation. But the final call is still up to the judge.

Judge Moody’s courtroom is a good example of discretion at play. In the case above, where he set bail at $150,000 for a man carrying a loaded firearm in public, Judge Moody first got a recommendation from the new algorithm, or as it’s officially called, the public safety assessment.

The algorithm actually recommended letting the man go free before trial. But Judge Moody disagreed, saying the man is a threat to public safety.

It’s not uncommon for the judge to overrule the algorithm’s recommendation. The algorithm recommends release two-thirds of the time, but Deputy Public Defender Chesa Boudin says judges have been releasing defendants only around one-third of the time. For people who believe the bail system keeps too many people in jail unnecessarily, those numbers are disappointing.

“The thing judges are most afraid of, understandably … is that if they let someone out, or set a low bail, [and] then that person goes on to do something tragic, or horrific,” Boudin says. “We cannot decide something like individual liberty based on a worst-case, hypothetical scenario.”

Man vs. machine

So, do we trust the humans or the computers?

“I think there’s certainly room for discretion, and there should be room for discretion,” says Sharad Goel, an assistant professor in Stanford University’s Department of Management Science and Engineering.

He and his researchers found that risk assessment tools are more likely to accurately predict risk than people. But, he says, humans can see nuances that computers can’t.

“When a defendant comes to court, they expect their individual story will be heard, and a decision will be based on that,” Goel says.  

More broadly, Goel says that though algorithms can be helpful guides to fair decisions, they can’t completely fix disparities in the criminal justice system.

“There’s strong evidence that algorithms are better than humans at making many of these consequential decisions,” Goel says. “My worry is that people will strictly use algorithms and not consider broader policy fixes.”

Looking beyond algorithms

One such fix would be to eliminate money bail entirely. California lawmakers recently unveiled a bill to do just that.

“I think the money bail system will be gone; we've already seen entire states that have gotten rid of money bail in its entirety,” Gascón says.

New Jersey is one of the states moving away from money bail. To make sure people show up in court, some suspects there are required to wear electronic bracelets, take drug tests, or enroll in programs as a condition for their release. At least six other states have passed similar legislation to expand those services.

A solution straight from the doctor’s office

In the meantime, Gascón has another futuristic technology that might just be the answer for getting people to show up in court: text messages.

“Automatic messages that say, ‘Hey George, you have court on Monday, court on Wednesday,’” Gascón says. “Much like if you have a doctor’s appointment, most of us now generally get a call the day before.”
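A reminder of this sort could be as simple as a function that builds a message the day before a court date. This is a hypothetical sketch of the idea Gascón describes, not San Francisco's actual system: the delivery side (the SMS gateway and scheduling) is left out, and the names and dates are made up.

```python
# Minimal sketch of an automated court-date reminder, in the spirit of
# the appointment-style texts described above. Only the message logic
# is shown; how the text actually gets sent is out of scope.

from datetime import date, timedelta

def reminder_text(name, court_date, today):
    """Build a reminder message the day before a court date, else None."""
    if court_date - today == timedelta(days=1):
        return f"Hey {name}, reminder: you have court tomorrow, {court_date:%A}."
    return None  # no reminder due today

print(reminder_text("George", date(2017, 1, 30), date(2017, 1, 29)))
# -> Hey George, reminder: you have court tomorrow, Monday.
```

In practice a system like this would run daily against a docket of upcoming hearings and hand each non-None message to an SMS provider.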

If it gets people to show up to the doctor, it might just work in the courtroom.