Secret algorithms that predict future criminals get a thumbs up from Wisconsin Supreme Court


There’s software used across the country that predicts whether people are likely to commit a crime. It’s not quite Minority Report, but the same basic idea is behind it: The software assesses various data points about a person and produces a risk score; the higher the score, the more likely that person is judged to commit a crime in the future. The scores are used by judges in a number of different jurisdictions when sentencing people convicted of crimes.

Back in May, ProPublica published an investigation into risk-assessment software and found that the algorithms were racially biased. ProPublica looked at the scores given to white people and black people and then at whether the predictions were correct (by checking whether those people actually went on to commit crimes). It found that in Broward County, Florida, which was using software from a company called Northpointe, black people were more likely to be mislabeled with high scores and white people were more likely to be mislabeled with low scores.
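To make that kind of comparison concrete, here’s a rough sketch of the error-rate arithmetic involved. The data, field names, and numbers below are invented for illustration; this is not ProPublica’s actual analysis code, just the general shape of comparing mistakes across two groups.

```python
# Illustrative sketch: compare how often a binary "high risk" label is wrong
# for two groups, given ground-truth outcomes. All data here is hypothetical.
from collections import namedtuple

Record = namedtuple("Record", ["group", "labeled_high_risk", "reoffended"])

records = [
    Record("A", True, False),   # labeled high risk, did not reoffend -> false positive
    Record("A", False, False),
    Record("B", True, True),
    Record("B", False, True),   # labeled low risk, did reoffend -> false negative
    # ... a real analysis would use thousands of cases
]

def error_rates(records, group):
    rows = [r for r in records if r.group == group]
    # False positive rate: share of non-reoffenders wrongly labeled high risk
    negatives = [r for r in rows if not r.reoffended]
    fpr = sum(r.labeled_high_risk for r in negatives) / len(negatives) if negatives else float("nan")
    # False negative rate: share of reoffenders wrongly labeled low risk
    positives = [r for r in rows if r.reoffended]
    fnr = sum(not r.labeled_high_risk for r in positives) / len(positives) if positives else float("nan")
    return fpr, fnr

for g in ("A", "B"):
    fpr, fnr = error_rates(records, g)
    print(f"group {g}: false positive rate {fpr:.2f}, false negative rate {fnr:.2f}")
```

The point of this kind of comparison is that two groups can face very different false positive and false negative rates even if the tool’s overall accuracy looks similar for both.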

This is obviously problematic, as a possible outcome is that judges will give longer sentences to black people based on an erroneous computer assessment of their risk. And that’s something a defendant named Eric Loomis seriously objects to. Northpointe’s software, called COMPAS, was used in his case in Wisconsin. Northpointe, like other companies that make this kind of software, says the formula used to come up with the scores is proprietary, so defendants can’t find out why they were deemed low or high risk. Loomis decided to appeal, arguing that the use of secret algorithms in the criminal justice system violates his right to due process.

In 2013, Loomis was arrested for allegedly being the driver in a drive-by shooting. He pled guilty to lesser charges of fleeing the police and driving a stolen car. After his plea was entered, the court had a Presentence Investigation Report (PSI) conducted, which included a risk score from COMPAS (short for Correctional Offender Management Profiling for Alternative Sanctions). Loomis, who was a registered sex offender, was deemed a high risk for committing another crime and got a six-year sentence.

Last week, the Supreme Court of Wisconsin issued an opinion in his case: It rejected Loomis’s request to be resentenced and said the lower court that sentenced him didn’t violate his due process rights by using risk-assessment software, because it didn’t rely on the risk score alone.

But the opinion comes with some interesting caveats about things judges need to keep in mind when using risk scores in sentencing decisions. The two most important are that the software has been found to be racially biased and that it needs to be constantly monitored and updated with new information. (If you’re relying on data from five or ten years ago, it’s not going to be accurate.)

What the opinion ignores is that algorithms like this weren’t actually originally intended to be used for sentencing. As ProPublica explained earlier this year, when risk-assessment tools were first rolled out in 1989, they were intended for the corrections industry generally—probation and parole officers—to let them know who they needed to pay the most attention to. It wasn’t intended as a score that determined how much time you should spend in prison. Yet in Wisconsin, COMPAS has been widely adopted; according to ProPublica it’s “used at each step in the prison system, from sentencing to parole.”

The Wisconsin Supreme Court doesn’t care that the software is considered a proprietary trade secret, because the program’s “Practitioner Guide” includes some of the types of data that are part of the assessment, including how many times the person being assessed has been arrested and how many times they’ve been charged with a new crime while on probation. In other words, what’s relevant according to the court is knowing what goes in, not how it’s weighted.
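To see why that distinction matters, here’s a toy sketch. The inputs echo the kinds of items the Practitioner Guide describes, but the formula and weights are invented for illustration; COMPAS’s actual scoring method is proprietary and isn’t a simple weighted sum like this.

```python
# A toy illustration of the court's distinction, not COMPAS's actual formula.
# Knowing *what* goes into a score says little about *how much* each item matters.

def risk_score(prior_arrests, new_charges_on_probation, weights):
    # A simple weighted sum mapped onto a 1-10 decile-style scale.
    raw = (weights["prior_arrests"] * prior_arrests
           + weights["new_charges_on_probation"] * new_charges_on_probation)
    return max(1, min(10, round(raw)))

same_person = dict(prior_arrests=3, new_charges_on_probation=1)

# Two hypothetical weightings produce very different scores for identical inputs.
print(risk_score(**same_person, weights={"prior_arrests": 0.5, "new_charges_on_probation": 1.0}))
print(risk_score(**same_person, weights={"prior_arrests": 2.0, "new_charges_on_probation": 4.0}))
```

The same inputs can yield very different scores depending on weights that stay secret, which is exactly the part a defendant can’t examine.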

The justices of the Wisconsin Supreme Court, though, did appear to read the ProPublica article (which they briefly cite), and they advise judges using risk-scoring programs going forward to keep in mind that “some studies of COMPAS risk assessment scores have raised questions about whether they disproportionately classify minority offenders as having a higher risk of recidivism.”

The queasiness that judges feel about algorithmic risk assessment is reflected in the concurring opinion filed by Justice Patience Drake Roggensack. “Reliance would violate due process protections,” she writes. “Accordingly, I write to clarify our holding in the majority opinion: consideration of COMPAS is permissible; reliance on COMPAS for the sentence imposed is not permissible.”

Justice Shirley Abrahamson, who also filed a concurrence, is similarly concerned. In her opinion she says that she believes that any time a court considers risk scores in sentencing, it “must set forth on the record a meaningful process of reasoning addressing the relevance, strengths, and weaknesses of the risk assessment tool.” Abrahamson also expresses concern that judges simply don’t know enough about tech (not a new complaint, even among appellate judges) to understand how programs like COMPAS work without significant help from the companies that make those programs.

In the end, this all means that Loomis will be going to prison. But it’s interesting to see judges, even as they affirm that risk-assessment programs can be used, express a lot of concern about how they can be used accurately. And given the concerns Abrahamson raises about whether judges can even understand the tech involved, and ProPublica’s reporting on the bias built into one of the two most commonly used programs, one has to wonder whether consulting tools like COMPAS can be helpful without coming at defendants’ expense.

Ethan Chiel is a reporter for Fusion, writing mostly about the internet and technology. You can (and should) email him at [email protected]
