"Nothing Is More Opaque Than Absolute Transparency"

The Use of Prior History to Guide Sentencing

Published in: Harvard Data Science Review, Volume 2, Number 1 (January 2020)

by Shawn D. Bushway


This article was published outside of RAND. The full text is available at hdsr.mitpress.mit.edu.

Rudin, Wang, and Coker (2020, henceforth RWC) present a convincing argument against black box algorithms like COMPAS that are sometimes used in the United States to help judges sentence convicted offenders in court. They point out that the lack of transparency means that defendants (and victims) cannot assess the accuracy of the score driving the decision, and researchers cannot accurately assess the fairness of any given decision rule. They then argue that these black box algorithms should be replaced with simple and transparent risk assessment algorithms that are based primarily on age and criminal history. Prior work has shown that these tools perform at least as well as the more costly proprietary tools.

Although the article focuses narrowly on proprietary risk tools like COMPAS, RWC's argument potentially has a much broader scope. As they discuss in their final section, the RWC argument that black box algorithms are not fair applies not only to COMPAS, but also to all discretionary sentencing done by judges. RWC state:

Interestingly, a system that relies only on judges—and does not use machine learning at all—has similar disadvantages to COMPAS; the thought processes of judges is (like COMPAS) a black box that provides inconsistent error-prone decisions. Removing COMPAS from the criminal justice system, without a transparent alternative, would still leave us with a black box.

Prior work has shown not only that judges are 'black boxes,' but also that they are not very good at identifying high-risk offenders (Gottfredson, 1999). An extension of the RWC argument, then, if I might be allowed to take the argument to an extreme that RWC did not advocate, would replace all black box algorithms, including judges, with a simple risk tool that uses age and criminal history to assign sentences.


This report is part of the RAND Corporation External publication series. Many RAND studies are published in peer-reviewed scholarly journals, as chapters in commercial books, or as documents published by other organizations.

Our mission to help improve policy and decisionmaking through research and analysis is enabled through our core values of quality and objectivity and our unwavering commitment to the highest level of integrity and ethical behavior. To help ensure our research and analysis are rigorous, objective, and nonpartisan, we subject our research publications to a robust and exacting quality-assurance process; avoid both the appearance and reality of financial and other conflicts of interest through staff training, project screening, and a policy of mandatory disclosure; and pursue transparency in our research engagements through our commitment to the open publication of our research findings and recommendations, disclosure of the source of funding of published research, and policies to ensure intellectual independence. For more information, visit www.rand.org/about/research-integrity.

The RAND Corporation is a nonprofit institution that helps improve policy and decisionmaking through research and analysis. RAND's publications do not necessarily reflect the opinions of its research clients and sponsors.