Criminal Legal Algorithms, Technology, and Expertise (CLATE)

Criminal Legal AI | 2024-present

Carceral algorithms encompass the broad category of algorithmic, automated, and data-driven practices employed in the criminal legal system. They can be as simple as a checklist or involve deep learning and complex statistical models, but their influence extends beyond their technical capacity. While often introduced as part of an “objectivity campaign” that positions the technology as more impartial, objective, and scientific than human decision-making, in practice these algorithms rely on human decision-makers in ways that can create tensions with established regulatory structures, reinforce or obfuscate existing biases, and expand the scope of carceral systems.

The Criminal Legal Algorithms, Technology, and Expertise (CLATE) project investigates how introducing such tools destabilizes work practices, legal frameworks, and the legitimacy of expert authority. Drawing on a combination of interviews, legal analysis, and quantitative data, we explore how algorithms challenge decision-making processes in policing and prosecution, and how expertise gets wielded. We compare how these dynamics unfold across international contexts and across different technological interventions, such as probabilistic DNA profiling, facial recognition technology, risk assessment instruments, and predictive policing.

Project Directors: Hannah Pullen-Blasnik and Julien Larregue