UPDATED 23:49 EST / MAY 23 2016

NEWS

Judges use biased, racist software to determine re-offending risk

It came as a mild surprise to learn that some U.S. courts actually rely on software to predict the likelihood that convicted criminals will re-offend when deciding what sentences to hand out. Even more surprising is that the software they use appears to be biased against black people.

ProPublica has just carried out a sweeping investigation into a software system made by a company called Northpointe Inc. that courts in Broward County, Florida, use to help assess the risk of convicts re-offending. The system, which Northpointe designed with input from several non-profit and for-profit groups, takes into account a series of factors about a convict’s life and circumstances before assigning a “risk score” that indicates how likely they are to re-offend. That score is then taken into consideration by judges, police, and prison officers.

A convict’s “risk score” is similar to a credit score, except that the consequences of a bad score can be far more serious: a person’s liberty is at stake, after all. Several states, including Arizona, Colorado, Delaware, Kentucky, Louisiana, Oklahoma, Virginia, Washington and Wisconsin, currently use the software.

In its investigation, ProPublica studied data on more than 7,000 people arrested in Broward County and came up with some alarming findings:

  1. The software isn’t very good at predicting who will re-offend
  2. Black people are much more likely to receive a high “risk score” but not re-offend
  3. White people are much more likely to receive a low “risk score” but re-offend

Indeed, ProPublica found that black defendants were wrongly flagged as likely re-offenders at nearly twice the rate of white defendants.
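To make the nature of those findings concrete, here is a minimal sketch of the kind of error-rate comparison ProPublica describes: grouping defendants by race and measuring how often the software flags people who never go on to re-offend (false positives) and clears people who do (false negatives). The tiny table and its column names below are illustrative assumptions, not ProPublica’s actual data or schema.

```python
import pandas as pd

# Illustrative sketch only: the column names and values below are assumptions
# for this example, not the schema of ProPublica's published dataset.
df = pd.DataFrame({
    "race":       ["black", "black", "black", "white", "white", "white"],
    "high_risk":  [1, 1, 0, 0, 0, 1],   # 1 = software flagged as likely to re-offend
    "reoffended": [0, 1, 0, 1, 0, 1],   # 1 = actually re-offended later
})

for race, group in df.groupby("race"):
    did_not_reoffend = group[group["reoffended"] == 0]
    reoffended = group[group["reoffended"] == 1]
    # False positive rate: flagged as high risk but never re-offended.
    fpr = did_not_reoffend["high_risk"].mean()
    # False negative rate: labeled low risk but did re-offend.
    fnr = (1 - reoffended["high_risk"]).mean()
    print(f"{race}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
```

ProPublica ran this sort of comparison on real records for the thousands of Broward County defendants mentioned above; the toy numbers here only show the shape of the calculation.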

ProPublica cites the example of an 18-year-old black woman who was arrested for stealing a kid’s bicycle and scooter. The suspect had a light criminal record (a couple of misdemeanors as a juvenile), and the total value of the stolen items was just $80. In a second case, a 41-year-old white man was arrested for stealing $86 worth of goods from Home Depot. That suspect had a previous conviction for armed robbery and had served five years in prison for that offense. Nonetheless, the software rated the woman as more likely to commit a future offense than the man.

Not surprisingly, the software was proven wrong. The woman was never charged with any new crimes, while the man was later arrested and convicted of breaking into a warehouse and stealing thousands of dollars’ worth of electronic goods. He is now serving an eight-year prison sentence.

ProPublica’s investigators concluded that the software is likely giving too much weight to factors such as social marginalization and wealth – things that correlate heavily with race.
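The mechanism behind that concern is the familiar proxy-variable problem: a model never has to be given race as an input to end up penalizing one racial group, because other inputs can stand in for it. The sketch below uses entirely synthetic data and a hypothetical “instability” feature to illustrate the effect; it is not a reconstruction of Northpointe’s model or its inputs.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Entirely synthetic population. "group" stands in for race and is never
# given to the model; "proxy" is a hypothetical feature (think economic
# instability) that happens to be correlated with group membership.
group = rng.integers(0, 2, n)                      # 0 or 1
proxy = rng.normal(loc=group.astype(float), scale=1.0)

# In this toy world, actual re-offending is identical across the two groups.
reoffend = rng.random(n) < 0.3

# A "model" that leans heavily on the proxy gives group 1 higher scores...
score = 1 / (1 + np.exp(-proxy))
flagged = score > 0.6

# ...so members of group 1 who never re-offend get flagged far more often,
# even though the model never saw their group at all.
for g in (0, 1):
    innocent = (group == g) & ~reoffend
    print(f"group {g}: false positive rate = {flagged[innocent].mean():.2f}")
```

In this toy setup the two groups re-offend at exactly the same rate, yet the heavily weighted proxy alone pushes the false positive rate for one group well above the other, which is the pattern ProPublica reports in the real scores.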

Should software have a place in the courtroom?

These findings are extremely worrying, and they call into question the very notion that officials such as judges and police should use software to ascertain whether an individual poses a risk of committing crimes in the future. After all, in any court case, due process requires that the accused be able to confront and cross-examine the witnesses against them. But how does one cross-examine a closed-source, proprietary algorithm whose score judges take into account?

No doubt Northpointe has provided plenty of guarantees about the accuracy of its software, and officials have taken the company at its word. But Northpointe’s word alone wouldn’t be enough to secure a conviction in court, so why should it be trusted when it comes to evaluating someone’s risk of re-offending, especially when a judge weighs that “risk score” in sentencing? One has to wonder how many people may have been given harsher sentences based on their apparent “high risk” of re-offending.

In response to the article, Northpointe said in a statement that it disputes ProPublica’s analysis, saying that its claims do not “accurately reflect the outcomes from the application of the model”. Tellingly, though, the company declined to provide details of how its algorithm works or how it arrives at its risk scores.

ProPublica’s full report is well worth a read; it is full of jarring side-by-side comparisons in which the software arrived at starkly contrasting “risk scores” for black and white defendants.

Image credit: TBIT via pixabay.com
