Racist Algorithms


Kia Andersen

The criminal justice system in North America has a multitude of issues that are not close to being solved. Any system that gives the rapist Brock Turner only 3 months in jail, while a hacker who exposed rapists faced up to 16 years, must have something wrong with it. But up until the digital age, these decisions were made by people. While it is difficult to maintain an objective perspective when prosecuting, the people on trial still have to live with the consequences of those decisions. So what does it mean when an algorithm hands out these decisions?

ProPublica is an amazing non-profit organization that pursues investigative journalism in the public interest. It has been investigating an algorithm used in Florida courts that gives each defendant a score from 1 to 10 predicting their future “criminality”. This score is supposed to be used to determine which rehabilitation or parole programs could benefit the defendant. But that’s not what happens. What ProPublica found was that judges were using these scores in sentencing. And even worse – the algorithm had racial bias.

When ProPublica analyzed the risk assessment scores, it found “significant racial disparities”. The algorithm was nearly twice as likely to falsely flag black defendants as “future criminals” as it was to falsely flag white defendants: it overestimated the future criminality of black people while underestimating future criminality for whites. ProPublica’s methodology controlled for criminal history, age, and gender, but Northpointe, the for-profit company that developed the software, disputed the analysis.
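To make “falsely flagged” concrete, here is a minimal sketch of how such a false-positive comparison can be computed. The column names (race, decile_score, two_year_recid) and the high-risk cutoff are assumptions based on ProPublica’s public data release, not something quoted in this post:

```python
# Minimal sketch of a false-positive-rate comparison by group.
# Column names and the cutoff are assumed, not verified against the dataset.
import pandas as pd

def false_positive_rate(df: pd.DataFrame, group: str, high_risk_cutoff: int = 5) -> float:
    """Share of defendants in `group` who did NOT reoffend within two years
    but were still scored as high risk (decile score >= cutoff)."""
    did_not_reoffend = df[(df["race"] == group) & (df["two_year_recid"] == 0)]
    if did_not_reoffend.empty:
        return float("nan")
    return (did_not_reoffend["decile_score"] >= high_risk_cutoff).mean()

# Example usage with ProPublica's published CSV (filename assumed):
# df = pd.read_csv("compas-scores-two-years.csv")
# print(false_positive_rate(df, "African-American"))
# print(false_positive_rate(df, "Caucasian"))
```

A “racially biased” score, in this sense, is one where these two numbers come out very different even though neither group actually reoffended.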

This is just another clear example of the institutionalized racism that is deeply embedded in the criminal justice system. Netflix’s popular documentary “13th” examines America’s transition from chattel slavery to institutionalized slavery in the prison economy. In America (and Canada), much of the criminal justice process has been privatized. This is a huge conflict of interest for what we call “justice”: people are profiting from the mass incarceration of black bodies.

While the algorithm does not take race as an input, it asks questions like “Was one of your parents ever sent to jail or prison?” and “How many of your friends/acquaintances are taking drugs illegally?”. When 1 in 3 black men in the US will go to prison in their lifetime, it is easy to see that the algorithm doesn’t need to ask about race directly to be racially biased.
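To see how a proxy question can carry racial information on its own, here is a toy simulation. Every number in it is invented purely for illustration; only the direction of the disparity mirrors the incarceration gap mentioned above:

```python
# Toy illustration with invented numbers: a questionnaire item correlated with
# race can shift scores by group even though race itself is never an input.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical rates of answering "yes" to "Was one of your parents ever sent
# to jail or prison?" -- the gap is assumed for illustration, not measured data.
parent_incarcerated_rate = {"black defendants": 0.25, "white defendants": 0.08}

for group, rate in parent_incarcerated_rate.items():
    proxy_answer = rng.random(n) < rate   # yes/no answers to the proxy question
    risk_score = 3 + 4 * proxy_answer     # simple rule: a "yes" adds 4 points
    print(group, "average risk score:", round(risk_score.mean(), 2))

# The scoring rule never sees race, yet the group averages differ because the
# proxy question is answered "yes" at different rates.
```

The point of the sketch is simply that dropping the race column does not drop the racial signal; it just moves it into the proxy questions.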

The algorithm being used here has real-life consequences. It is important to remember that technology is not neutral. Technology is a human by-product, and it inherits our biases. Technological optimists might believe that an algorithm will eliminate bias, but this one clearly does not: the algorithm is racially biased.

This is deeply concerning on many levels. Yes, judges and law enforcement are also biased, but they still have to look the ‘criminal’ in the face. Handing this tough judgement call off to an algorithm makes the process even less personal and less humane. With algorithms making more and more decisions for us, we need to decide where to draw the line. Having an algorithm do the dirty work for us doesn’t make the process fairer – it just makes us feel better about it.

Source: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing

