
THE MINORITY REPORT: DOES THE USE OF RISK ASSESSMENT ALGORITHMS UNFAIRLY BIAS MINORITIES IN REPORTING?


Algorithms that predict the likelihood that an offender will commit a future crime are used in various jurisdictions within the United States’ criminal justice system.[1] Judges rely on these risk assessments to inform decisions concerning the rehabilitation and sentencing needs of the incarcerated. However, risk assessment algorithms are often a mystery: the factors included in an assessment and the weight given to each factor are considered proprietary by the private companies that provide the assessments.[2] Overreliance on an individual’s risk assessment score may therefore implicate due process, since it means sentencing individuals based on factors that are not clearly identifiable or readily reviewable.


The issue has proven contentious in Wisconsin, where the Wisconsin Supreme Court recently sanctioned the use of a risk assessment algorithm to estimate the likelihood that a defendant would commit another crime and return to incarceration.[3]


In State of Wisconsin v. Loomis, the court explored the use of risk assessment algorithms in sentencing decisions and acknowledged that sentencing courts must observe specific practices to avoid due process violations. Specifically, sentencing courts using these risk assessment tools must note and observe the following cautions:


1. The proprietary nature of the assessment tool has been invoked to prevent disclosure of information relating to how factors are weighed or how risk scores are determined;


2. The risk assessment compares defendants to a national sample, but no cross-validation study for a Wisconsin population has yet been completed;


3. Some studies of the assessment tool have raised questions about whether it disproportionately classifies minority offenders as having a higher risk of recidivism; and


4. Risk assessment tools must be constantly monitored and re-normed for accuracy due to changing populations and subpopulations.


The court went further, stating that “providing information to sentencing courts on the limitations and cautions attendant with the use of risk assessments will enable courts to better assess the accuracy of the assessment and the appropriate weight to be given to the risk score.”[4]


The Wisconsin court touched upon a common concern: that these assessment tools may be unfairly biased against black and brown offenders. Indeed, one study found that the algorithm used in the risk assessment of Broward County arrestees was particularly likely to falsely flag black defendants as future criminals, wrongly labeling them this way at almost twice the rate of white defendants.[5] The importance of transparency and accountability when relying on such algorithms thus becomes clear.
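
To make that kind of disparity concrete, here is a minimal sketch in Python of a group-wise false positive rate comparison, the metric behind findings like the one above. The records and numbers below are invented for illustration; this is not the proprietary assessment model or the study’s actual data.

# Hypothetical sketch: measuring a false-positive-rate disparity between groups.
# Each record: (group, predicted_high_risk, actually_reoffended). Toy data only.
from collections import defaultdict

records = [
    ("black", True, False), ("black", True, True), ("black", False, False),
    ("black", True, False), ("white", False, False), ("white", True, True),
    ("white", False, False), ("white", True, False),
]

# False positive rate per group: among people who did NOT reoffend,
# the share the tool nonetheless flagged as high risk.
flagged = defaultdict(int)
non_reoffenders = defaultdict(int)
for group, predicted_high_risk, reoffended in records:
    if not reoffended:
        non_reoffenders[group] += 1
        if predicted_high_risk:
            flagged[group] += 1

for group in sorted(non_reoffenders):
    fpr = flagged[group] / non_reoffenders[group]
    print(f"{group}: false positive rate = {fpr:.0%}")

On this toy data the script prints a 67% false positive rate for one group and 33% for the other, i.e., the flagged-in-error rate for non-reoffenders in one group is double that of the other, even though the tool’s overall accuracy might look reasonable. That is exactly the kind of disparity that is invisible when the factors and weights behind a score cannot be inspected.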


Under due process principles, a question arises as to whether some courts’ use of, and overreliance on, risk assessment tools is an erroneous exercise of the court’s discretion in sentencing decisions. It is imperative that courts heed the warnings that accompany risk assessment tools: the tools should not be used to determine the severity of a sentence or whether an individual should be incarcerated. Rather, they are merely one additional consideration bearing on the rehabilitation needs of the offender once released back into the general population.[6]



