Beyond bureaucracy: Applied AI in the criminal justice system
10 February 2025
Article written by Dr Ruth Spence
Leading academics, practitioners and industry experts explored the increasing use of AI in the criminal justice system and its implications.
As artificial intelligence (AI) reshapes society, its impact on the criminal justice system needs to be evaluated, both in terms of the practical and ethical challenges AI might bring and the opportunities it offers for improved efficiency and enhanced public safety. Our recent joint online event, "Beyond bureaucracy: Applied AI in the criminal justice system", organised by Middlesex University academic Dr Ruth Spence and herEthical.AI, brought together leading academics, practitioners and industry experts to discuss the use of AI in the criminal justice system. In this blog, written by Dr Spence, a Senior Research Fellow at MDX, we highlight some of the key points raised at the conference and some of the cutting-edge research being done in this area.
The victim perspective
Victims are often left retraumatised by their experiences with the criminal justice system. Long wait times, bureaucratic hurdles, and a lack of empathy in interactions with law enforcement and legal professionals can deepen the psychological toll of crime. Many victims report feeling dismissed, disbelieved, or even blamed for what they have endured, further eroding their trust in the system meant to protect them. Cybercriminologist Nakshathra Suresh and herEthical CEO Dr Tamara Polajnar both discussed what can be done to improve the victim experience. Nakshathra outlined trends in generative AI harms and discussed the need for harm reduction principles to be incorporated in new and emerging technologies. In her presentation, Dr Polajnar argued that AI can be used to drive culture-based change in policing to improve victim experience and outcomes, and discussed an AI model which detects victim-blaming language. This highlights how AI-driven solutions could help create a more trauma-informed justice system, ensuring that victims are treated with dignity, empathy, and fairness.
Using data effectively
Improving the victim experience also requires systemic improvements in how cases are managed. The ability to analyse and act on information efficiently is critical to ensuring that victims receive timely support and that perpetrators are held accountable. Police hold a tremendous amount of data, yet limitations in analysis tools and resources often lead to overwhelming backlogs, leaving victims without the support they need. When data is scattered across multiple systems or stored in formats that require time-consuming manual review, crucial evidence can be overlooked, and connections between cases may go unnoticed. This can hinder efforts to prevent crime, prosecute offenders, and support those affected. Matt Whalley, Head of Strategic Digital Innovation at the Police Digital Service, pointed out that there are currently data challenges to be overcome for effective AI integration, including giving organisations and academics access to the data, ensuring the data is of good quality, and preventing the use of AI from letting bias creep in. Ethicist and criminologist Dr Kat Hadjimatheou and Securium CEO Dr Anna Vartapetiance outlined how AI-enabled analysis could be used to identify hidden patterns in large data sets beyond the capability of humans. More efficient data analysis could not only save time and money but enable law enforcement to prioritise cases more effectively, allocate resources where they are most needed, and inform opportunities for cross-sector interventions relating to victims and perpetrators.
Ethical issues
Giles Herdale also emphasised the potential for AI to enhance productivity but warned of the risks of adopting it without a clear approach for widespread application, highlighting that transparency, public engagement and feedback are needed when adopting AI applications, especially when public trust and confidence in the police is already low. The Crown Prosecution Service's Head of Delivery Management Claire Beaumont and Sophie Marlow, head of the legal prosecution and priority policy teams, pointed out the importance of ensuring human oversight when implementing AI tools to maintain public confidence in, and the effectiveness of, legal processes, whilst Dr Stephen Anning touched on the importance of integrating qualitative research approaches with natural language processing systems to maintain the high standards of analytical rigour that leaders rightly demand for the high-consequence decisions they make.
Across the day our speakers highlighted the ways in which AI is increasingly shaping the criminal justice system, offering opportunities to improve efficiency and decision-making while also raising ethical and practical challenges. Discussions centred on key concerns, including the victim experience, data management, and the need for transparency and oversight. While AI has the potential to enhance justice processes, its implementation must be carefully managed to prevent bias and ensure accountability. Ongoing collaboration between researchers, practitioners, and policymakers will be essential to developing AI tools that support fairness, efficiency, and trust in the criminal justice system.
Photo by ThisIsEngineering on Pexels.com