The criminal justice system is an integral part of our society and has the power to shape many aspects of our lives, from where we can live to how much money we can make. But what if that power were applied unfairly, on the basis of bias? This is a growing concern as algorithms are increasingly used in the criminal justice system. A new study reveals that algorithms used to predict the likelihood that a defendant will commit future crimes are biased. The study found that the algorithms were more likely to overestimate the risk posed by African American defendants and to underestimate the risk posed by White defendants, which could lead to unfair outcomes for defendants of color, such as longer sentences and increased surveillance.
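
To make the over/underestimation finding concrete, the sketch below (Python, with made-up data and generic group labels; it is not the study's method or dataset) shows how such a disparity is commonly measured: by comparing a risk classifier's false positive and false negative rates across groups.

```python
# Minimal illustration (hypothetical data, not from the study) of how
# over- and underestimation of risk are typically quantified: by comparing
# false positive and false negative rates of a risk classifier per group.

from collections import defaultdict

# Each record: (group, predicted_high_risk, reoffended).
# These values are invented purely for illustration.
records = [
    ("A", True,  False),  # flagged high risk, did not reoffend -> false positive
    ("A", True,  True),
    ("A", False, False),
    ("A", True,  False),
    ("B", False, True),   # flagged low risk, did reoffend -> false negative
    ("B", False, False),
    ("B", True,  True),
    ("B", False, True),
]

def error_rates(rows):
    """Return (false positive rate, false negative rate) for a list of rows."""
    fp = sum(1 for _, pred, actual in rows if pred and not actual)
    fn = sum(1 for _, pred, actual in rows if not pred and actual)
    negatives = sum(1 for _, _, actual in rows if not actual)  # did not reoffend
    positives = sum(1 for _, _, actual in rows if actual)      # did reoffend
    fpr = fp / negatives if negatives else 0.0
    fnr = fn / positives if positives else 0.0
    return fpr, fnr

by_group = defaultdict(list)
for row in records:
    by_group[row[0]].append(row)

for group, rows in sorted(by_group.items()):
    fpr, fnr = error_rates(rows)
    print(f"group {group}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
    # A higher false positive rate means the classifier overestimates risk for
    # that group; a higher false negative rate means it underestimates risk.
```

With this toy data, group A shows a high false positive rate (risk overestimated) while group B shows a high false negative rate (risk underestimated), which is the shape of disparity the study describes.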


Source: Phys.org