Algorithms, Race, and Reentry: A Review of Sandra G. Mayson’s Bias In, Bias Out
In true Minority Report fashion, state actors increasingly rely on algorithms to assess the risk that a person will commit a future crime. Unlike Minority Report, these algorithms merely estimate the likelihood of rearrest; they do not offer the certainty about future criminal behavior that condemned Tom Cruise’s character in the 2002 film. Still, criminal justice actors use many types of algorithmic risk assessment to inform pre-trial investigations, bail recommendations and decisions, and post-trial sentencing and parole proceedings. Sandra G. Mayson’s article, Bias In, Bias Out,[1] published in 2019 in the Yale Law Journal, explains how these algorithms could reflect and project past and present racial bias in the criminal justice system and elsewhere.
At its core, a risk-assessment algorithm identifies individual traits that are correlated with committing crime. If the data show that people of color are arrested more frequently, then the algorithm will predict more arrests for people of color. In this sense, an accurate algorithm “holds a mirror to the past” by “distilling patterns in past data and projecting them into the future.” Mayson provides an in-depth yet easy-to-follow explanation of why race neutrality is unattainable when base rates of arrest differ across racial groups: any prediction drawn from that data must reproduce the disparity somewhere, whether in the risk scores themselves or in the rates of error across groups. These mirror-like algorithms give us the opportunity to view clearly the racial disparity in arrests and convictions. Is there something wrong with this image, and what should we do now that we’ve seen it?
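A small simulation makes the base-rate point concrete. The sketch below is not from Mayson’s article: the two base rates, the Beta-distributed risk scores, and the 0.5 flagging threshold are all illustrative assumptions. It constructs a risk score that is perfectly calibrated, by construction, for two groups differing only in their recorded arrest rate, then shows that the higher-base-rate group nonetheless ends up with a higher false positive rate:

```python
import random

random.seed(42)

def group_fpr(base_rate, n=200_000, threshold=0.5, concentration=10):
    """False positive rate for one group under a calibrated risk score.

    Each person's true rearrest probability s is drawn from a Beta
    distribution whose mean equals the group's base rate; rearrest then
    occurs with probability s, so the score is perfectly calibrated by
    construction. Anyone with s >= threshold is flagged as high risk.
    """
    a = base_rate * concentration
    b = (1 - base_rate) * concentration
    flagged_negatives = negatives = 0
    for _ in range(n):
        s = random.betavariate(a, b)      # the algorithm's risk score
        rearrested = random.random() < s  # outcome follows the score
        if not rearrested:
            negatives += 1
            if s >= threshold:
                flagged_negatives += 1
    return flagged_negatives / negatives  # P(flagged | not rearrested)

# Hypothetical base rates, chosen only to illustrate the mechanism.
print(f"FPR at base rate 0.2: {group_fpr(0.2):.3f}")  # roughly 0.01
print(f"FPR at base rate 0.4: {group_fpr(0.4):.3f}")  # roughly 0.17
```

The score is equally well calibrated for both groups, yet the group with the higher recorded arrest rate has far more non-rearrested people sitting above the threshold. This is the formal sense in which race neutrality is unattainable: when base rates differ, no single score can be calibrated for both groups while also equalizing their false positive rates.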