

It came as a mild surprise to learn that some U.S. courts actually rely on software to predict the likelihood of convicted criminals re-offending when considering the sentences they hand out. But even more surprising is that the software they use appears to be biased against black defendants.
ProPublica has just carried out a sweeping, thorough investigation into the software system made by a company called Northpointe Inc., which is used by the courts in Broward County, Florida, to help assess the risk of convicts re-offending. That system, which Northpointe designed with input from several non-profit and for-profit groups, takes into account a series of factors about a convict’s life and circumstances before assigning a “risk score” which indicates how likely they are to re-offend. That score is then taken into consideration by judges, police, and prison officers.
A convict’s “risk score” is similar to a credit score, only the consequences of a bad score can be far more serious – a person’s liberty is at stake, after all. Several states, including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, currently use the software.
In its investigation, ProPublica studied the available data on more than 7,000 people arrested in Broward County, and came up with some alarming findings.
Most strikingly, ProPublica found that black defendants who did not go on to re-offend were wrongly flagged as likely re-offenders at nearly twice the rate of white defendants.
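ProPublica’s comparison hinges on error rates rather than raw scores: of the defendants who went on to stay crime-free, what fraction had been labeled high risk? A minimal sketch of that calculation, using invented counts purely for illustration (ProPublica’s actual figures are in its report):

```python
def false_positive_rate(false_positives, true_negatives):
    """FPR = FP / (FP + TN): of everyone who did NOT re-offend,
    the fraction that was wrongly labeled high risk."""
    return false_positives / (false_positives + true_negatives)

# Invented counts: two groups of 100 defendants each, none of whom re-offended.
fpr_group_a = false_positive_rate(false_positives=44, true_negatives=56)
fpr_group_b = false_positive_rate(false_positives=22, true_negatives=78)

print(fpr_group_a)                          # 0.44
print(fpr_group_b)                          # 0.22
print(round(fpr_group_a / fpr_group_b, 1))  # 2.0 -> "twice the rate"
```

A tool can post similar overall accuracy for both groups while still making this kind of error far more often for one of them, which is why the per-group breakdown matters.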
ProPublica cites the example of an 18-year-old black woman who was arrested for stealing a kid’s bicycle and scooter. She had a light criminal record (a couple of juvenile misdemeanors), and the total value of the stolen items was just $80. In a second case, a 41-year-old white man was arrested for stealing $86 worth of goods from Home Depot. He had a previous conviction for armed robbery and had served five years in prison for that offense. Nonetheless, the software decided that the woman was more likely to commit an offense in the future than the man.
Not surprisingly, the software was proven wrong. The woman was never charged with any new crimes, while the man was later arrested and convicted of breaking into a warehouse and stealing thousands of dollars’ worth of electronic goods. He is now serving an eight-year stretch in jail.
ProPublica’s investigators concluded that the software is likely giving too much weight to factors such as social marginalization and wealth – things that correlate heavily with race.
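Though Northpointe’s model is proprietary, the mechanism ProPublica’s conclusion points at is easy to sketch: a score that never asks about race can still be skewed by race if it leans heavily on inputs that correlate with it. A toy example – every weight and feature name here is invented, and bears no claimed resemblance to the real model:

```python
def risk_score(defendant):
    """Toy linear risk score over socioeconomic proxies (all invented)."""
    weights = {
        "unemployed": 3.0,         # marginalization proxies weighted heavily
        "unstable_housing": 2.5,
        "prior_convictions": 1.0,  # actual criminal history weighted lightly
    }
    return sum(weights[k] * defendant[k] for k in weights)

# A repeat offender with stable circumstances vs. a first-timer with
# markers of poverty:
affluent_repeat_offender = {"unemployed": 0, "unstable_housing": 0,
                            "prior_convictions": 3}
poor_first_timer = {"unemployed": 1, "unstable_housing": 1,
                    "prior_convictions": 0}

print(risk_score(affluent_repeat_offender))  # 3.0
print(risk_score(poor_first_timer))          # 5.5
```

In this contrived scoring, the first-time defendant outranks the repeat offender – and because markers of poverty correlate with race, a racial disparity reappears in the scores without race ever being an input.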
These findings are extremely worrying, and call into question the very notion that software should be used by officials like judges and police to ascertain whether or not an individual poses a risk of committing crimes in the future. After all, in any court case, due process requires that the accused be able to confront and cross-examine the witnesses against them. But how does one cross-examine a closed-source, proprietary computer algorithm that judges are required to use?
No doubt, Northpointe has provided lots of guarantees about the accuracy of its software, and officials have taken them at their word. But Northpointe’s word alone wouldn’t be enough to secure a conviction in court, so why should it be trusted when it comes to evaluating someone’s risk of re-offending, especially when the judge takes into account that “risk factor” when sentencing a convict? One has to wonder how many people may have been given harsher sentences based on their apparent “high risk” of re-offending.
In response to the article, Northpointe said in a statement that it disputes ProPublica’s analysis, saying that its claims do not “accurately reflect the outcomes from the application of the model”. Tellingly, though, the company declined to provide details of how its algorithm works or how it arrives at its risk scores.
ProPublica’s full report is well worth a read; it is full of jarring comparisons showing how the software arrives at highly contrasting “risk scores” for black and white defendants.