
Risk-Assessment Algorithms: Bringing Racial Bias into the 21st Century

In July of this year, the Pennsylvania Sentencing Commission provided judges across the state with an algorithm-based risk-assessment tool to use in making sentencing determinations. The tool estimates an individual's likelihood of reoffending by looking at factors such as age, gender, and criminal history.

The Keystone State is not the first jurisdiction to implement a risk-assessment algorithm, and it is unlikely to be the last. Similar systems are already in use in multiple states, including New York, California, and Florida, and many more jurisdictions are developing or rolling out their own algorithms.

At first glance, a program that gives judges assistance and guidance during sentencing may seem like a good thing. It should help keep high-risk offenders off the street and spare low-risk individuals from spending too long in prison, right?

Unfortunately, historical analysis and scientific testing have repeatedly shown that these risk-assessment algorithms do not consistently achieve this desired outcome. Instead, they have a tendency to inaccurately and disproportionately flag certain groups (usually African Americans and other minorities) as high-risk offenders.

A ProPublica analysis of the risk-assessment algorithm used in Broward County, Florida found that 44.9% of African American defendants who did not go on to re-offend had nevertheless been flagged as high-risk. For white defendants, that figure was 23.5%.
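To make that statistic concrete, here is a minimal sketch of the false positive rate calculation it reflects. The group names and counts below are invented for illustration; they are not ProPublica's actual figures.

```python
# Minimal sketch of a false positive rate calculation, the measure behind
# figures like ProPublica's. All counts here are illustrative placeholders.

def false_positive_rate(flagged: int, not_flagged: int) -> float:
    """Among defendants who did NOT re-offend, the share flagged high-risk."""
    return flagged / (flagged + not_flagged)

# Hypothetical counts of non-reoffending defendants, split by whether the
# algorithm labeled them high-risk.
groups = {
    "Group A": {"flagged": 450, "not_flagged": 550},  # FPR = 45.0%
    "Group B": {"flagged": 235, "not_flagged": 765},  # FPR = 23.5%
}

for name, counts in groups.items():
    fpr = false_positive_rate(counts["flagged"], counts["not_flagged"])
    print(f"{name}: {fpr:.1%} of non-reoffenders were flagged high-risk")
```

Measured this way, a higher number means more people in that group were wrongly labeled as likely to re-offend.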

So, why do algorithms like the one in Broward County and the one due to be introduced in Pennsylvania produce such inaccurate and racially biased results? The problem can be largely attributed to the weight these systems place on an individual's criminal history and juvenile records.

According to the Department of Justice, African Americans are stopped by the police (both in traffic and on the street) far more often than their white counterparts. More stops lead to more arrests and longer criminal records, so when the algorithm evaluates these individuals, it reads that enforcement disparity as personal risk and is more likely to flag them as high-risk offenders.
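A toy model helps show the mechanism. The sketch below assumes a simple weighted-sum risk score; the feature names, weights, and records are all hypothetical and do not reflect any actual tool's proprietary formula.

```python
# Sketch of how a risk score that heavily weights criminal history can encode
# an enforcement disparity. Weights and records are hypothetical.

HYPOTHETICAL_WEIGHTS = {
    "prior_arrests": 0.5,      # history feature driven partly by policing rates
    "age_under_25": 1.0,
    "prior_convictions": 0.8,
}

def risk_score(record: dict) -> float:
    """Weighted sum of risk factors; a higher score means 'higher risk'."""
    return sum(HYPOTHETICAL_WEIGHTS[k] * v for k, v in record.items())

# Two individuals with identical behavior; one was simply stopped and
# arrested more often, so their record shows more prior arrests.
lightly_policed = {"prior_arrests": 1, "age_under_25": 1, "prior_convictions": 0}
heavily_policed = {"prior_arrests": 4, "age_under_25": 1, "prior_convictions": 0}

print(risk_score(lightly_policed))  # 1.5
print(risk_score(heavily_policed))  # 3.0 — double the score, same behavior
```

The point of the sketch is that the score never sees behavior directly, only the paper trail that policing produces, so unequal policing flows straight through to unequal risk labels.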

Until this issue can be addressed, risk-assessment algorithms will continue to be biased and inaccurate, traits that we should not accept in our criminal justice system.

The attorneys here at Shapiro Zwanetz & Lake have been helping the people of Maryland fight back against the inadequacies of the criminal justice system for years. If you would like to have us defend you against your criminal charges, just give us a call at (410) 927-5137 to set up an initial consultation.
