Algorithms Have Potential To Reduce Sentencing Disparities

By Aaron Horowitz, Kristian Lum, Erica Marshall and Mikaela Meyer | September 23, 2022, 5:41 PM EDT

There is no shortage of evidence showing the stark racial disparities in federal sentencing. And yet, after decades of advocacy, organizing, policymaking and an incredible amount of research, sentencing disparities have not improved.

These alarming disparities persist in part because this clear evidence has not been used to make sentencing practices less harsh or more fair. This systemic failure is unjust — but not unsolvable.

Data has been used for years to shape practices and policies in our justice system. Algorithms are used in settings such as pretrial release,[1] bail determinations, sentencing[2] and parole supervision.[3] Most commonly, judges use them to make pretrial determinations[4] about whether to release or detain a person pending trial.

Criminal legal system actors have commonly developed and used algorithms to try to judge the risk posed by people accused of crimes. These tools analyze certain factors about an individual to inform whether that person is released or held in custody, often for months, before ever being adjudicated guilty.

For example, the Public Safety Assessment, developed by Arnold Ventures for pretrial risk assessment, uses information about the defendant as inputs to its model, including age, criminal history, record of failing to appear at past hearings, and whether the current offense is considered violent.

Each factor is weighted according to how closely it relates to two outcomes: new criminal activity and failure to appear at future court dates. The sum of these weights is converted to a scaled six-point score for each outcome, and those scores are then used to recommend whether an individual should be released pretrial.[5]
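To illustrate the arithmetic, the sketch below shows how a weighted-factor score of this general kind can be computed and rescaled. The factor names and weights are invented for illustration; they are not the PSA's actual inputs or values.

    # Illustrative weighted-factor risk score.
    # Hypothetical factors and weights, not the PSA's actual inputs.
    weights = {
        "age_under_23": 2,
        "prior_violent_conviction": 2,
        "prior_failure_to_appear": 4,
        "pending_charge_at_arrest": 1,
    }

    def scaled_score(defendant, weights):
        # Sum the weights of the factors present for this defendant ...
        raw = sum(w for factor, w in weights.items() if defendant.get(factor))
        # ... then convert the raw sum to a one-to-six scale.
        return 1 + round(5 * raw / sum(weights.values()))

    defendant = {"prior_violent_conviction": True, "prior_failure_to_appear": True}
    print(scaled_score(defendant, weights))  # prints 4 for this hypothetical defendant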

Upon conviction, algorithms predicting recidivism can influence the sentence a person receives, and once incarcerated, these tools are often used to influence decisions about granting parole and levels of parole supervision.[6]

While system actors have used algorithms to judge the risk posed by defendants, these tools have not been used to analyze the risk that the criminal justice system poses to the defendants.

Using federal sentencing data published each year by the U.S. Sentencing Commission, our team of researchers and advocates developed an algorithm[7] to predict which defendants were likely to receive a disproportionately lengthy sentence based on factors that should be legally irrelevant at sentencing, such as race, jurisdiction and other factors outside of the defendant's control.

It performed similarly to commonly used pretrial risk assessments, achieved a "good" predictive accuracy according to standards[8] in the field, and was able to predict which defendants were more likely to receive a disproportionately lengthy sentence.
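As a rough illustration of the approach, and not the model itself, the sketch below fits a simple classifier to flag disproportionately long sentences and measures its predictive accuracy. The file name, column names and outcome definition are placeholders, not our actual code or data.

    # Hypothetical sketch: predict disproportionately long sentences from factors
    # that should be legally irrelevant, then measure predictive accuracy.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("ussc_sentences.csv")  # assumed extract of Sentencing Commission data
    y = df["disproportionately_long"]       # assumed 0/1 outcome column
    X = pd.get_dummies(df[["race", "district", "citizenship"]], drop_first=True)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Area under the ROC curve, the accuracy measure the field's standards refer to.
    print(roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))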

While it performed at a level accepted in our criminal justice system, the algorithm is not without the same inherent flaws that plague all risk assessment tools. To start, there is inherent bias in the underlying data available due to the discretion of criminal legal system actors, including police officers' choices of whom to arrest, and prosecutors' charging decisions — all of which have occurred before the judge imposes a sentence.

Second, it is important to note that the algorithm we developed, like other algorithmic tools, is undemocratic. We used our judgment to determine what kind of variables to include. Such tools must have effective public process[9] in their development if they are going to be adopted for broader use.

In spite of the shortcomings that plague all algorithmic tools, our algorithm shows that predictive modeling can be used differently in the criminal legal system: to help system actors improve outcomes, rather than simply to assess or score the people involved in it.

Federal law requires that two people with similar criminal histories and similar offense conduct be treated similarly,[10] but our algorithm and other research show that our system is not meeting this requirement. However, available data could be used to develop a tool to help our system comply with federal rules.

Since more than 90% of cases settle by plea, sentencing is often the only time that an attorney has an opportunity to advocate for their client. With a tool in hand, attorneys could notify judges about the heightened risk for a disproportionately long sentence based on the defendant's demographic information.

The defendant could also use such a tool to argue for a lower sentence consistent with those handed down to similarly situated defendants, and to support appeals of disproportionately long sentences.

There is potential for additional applications. The First Step Act, signed into law in 2018, enables defendants to file motions directly with the court seeking sentence reductions where extraordinary and compelling circumstances warrant a reduction.

Federal district courts across the country have since granted thousands of such motions where the defendant's personal history, underlying offense, original sentence, disparity created by any changes in the law, or sentencing factors at Title 18 of the U.S. Code, Section 3553, warrant such a reduction.

Algorithms like ours could allow defendants to show how disparities in their sentencing meet these criteria.

This kind of algorithmic model could also transform the presidential power of clemency, which is set forth in the U.S. Constitution in Article II, Section 2. Using data tools to create clemency lists could dramatically reduce bias and racial disparities in who is granted clemency, resulting in significantly more equitable outcomes.

We must make the choice to focus on improving outcomes in our criminal legal system rather than targeting the people who are a part of it. By using data and new tools, we could reduce sentencing disparities and unreasonable harshness in the federal system.

But if we are unwilling to use available data and tools to make our system more just, we should no longer use these resources to rank, score or perpetuate systemic bias and cruelty within the system.



Aaron Horowitz is head of analytics at the American Civil Liberties Union.

Kristian Lum worked on the study discussed here while she was an assistant research professor of computer and information science at the University of Pennsylvania.


Erica Marshall is executive director at the Idaho Justice Project.

Mikaela Meyer is a doctoral candidate in statistics and public policy at Carnegie Mellon University.

"Perspectives" is a regular feature written by guest authors on access to justice issues. To pitch article ideas, email expertanalysis@law360.com.


The opinions expressed are those of the author(s) and do not necessarily reflect the views of their employer, its clients, or Portfolio Media Inc., or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.

[1] https://pubmed.ncbi.nlm.nih.gov/31414840/.

[2] https://judicature.duke.edu/articles/assessing-risk-the-use-of-risk-assessment-in-sentencing/.

[3] https://onlinelibrary.wiley.com/doi/abs/10.1002/9781119184256.ch4.

[4] https://craftmediabucket.s3.amazonaws.com/uploads/PDFs/3-Predictive-Utility-Study.pdf.

[5] https://advancingpretrial.org/psa/factors/.

[6] https://journals.sagepub.com/doi/abs/10.1177/0093854808326545.

[7] https://www.aclu.org/news/prisoners-rights/what-if-algorithms-worked-for-accused-people-instead-of-against-them.

[8] https://csgjusticecenter.org/wp-content/uploads/2020/02/Risk-Assessment-Instruments-Validated-and-Implemented-in-Correctional-Settings-in-the-United-States.pdf.

[9] https://link.springer.com/article/10.1007/s13194-021-00437-7.

[10] 18 U.S.C. Section 3553(a)(6) directs courts to consider the "need to avoid unwarranted sentence disparities among defendants with similar records who have been found guilty of similar conduct."
