Avon and Somerset Police made headlines early this year when its chief announced that using AI could improve the quality of rape investigations.
In 2024, Chief Constable Sarah Crew approved a pilot of Söze, an Australian-developed software tool that enables detectives to identify links between suspects and track their movements, as well as search through phone calls, messages and documents.
Presumably, this could include using the tool for disclosure purposes.
Disclosure is the duty of the police and prosecution to reveal material to the defence that may undermine the prosecution case or assist the defence, ensuring a fair trial by preventing convictions based on incomplete or one-sided evidence.
It includes evidence gathered during an investigation that the prosecution does not intend to use at trial, such as messages, CCTV, social media material or unused witness statements.
On the face of it, it’s an area ripe for the use of AI, given the increasingly vast amounts of digital data that have to be analysed and evaluated. The largest investigation currently on the Serious Fraud Office’s books, for example, involves 48 million documents.
During the six-week Söze pilot, investigators completed complex data structuring and analysis in just over a day – work that would have taken an estimated 81 years to carry out manually.
The force has yet to announce whether it will purchase the software. But according to research published by Oxford University’s Institute of Technology and Justice, AI is already being deployed at every stage of the UK criminal justice process by at least one stakeholder.
In September 2025, a Scottish judge marked a legal milestone when he publicly acknowledged using AI to summarise documents for his decision on a disclosure application in a tax case – while emphasising that the legal analysis and responsibility for the ruling remained his alone.
Meanwhile, a government-commissioned review of disclosure in fraud cases has warned that unless current practices are adapted to meet the challenge posed by the mountain of digital data, the Crown’s ability to investigate and prosecute criminal cases risks being overwhelmed.
It recognised that AI technology could play a valuable role in managing disclosure in the digital age, but stressed that its use must be transparent, properly governed and consistent with existing disclosure obligations.
The report also warned against relying on automated tools without clear oversight, certification and the ability for the defence to understand and, where necessary, challenge how material has been identified and reviewed.
Shortcuts with consequences?
While AI may be able to execute tasks at lightning speed, using it can also be like taking a shortcut through a minefield. You may reach your destination far faster – but you also risk being blown to smithereens on the way.
The now former West Midlands Chief Constable learned this the hard way after it emerged that the risk analysis his force used to justify banning Israeli football fans from a match in Birmingham was based on an AI hallucination.
The mistake cost Craig Guildford his job, raising serious questions about the potential pitfalls of AI within the justice system – not least for those facing criminal prosecution.
In the context of disclosure, the use of AI raises several key concerns:
- Missed disclosure – AI may fail to identify relevant material, especially if it is poorly trained or configured.
- Lack of transparency – it can be difficult for the defence to understand how an AI system has searched, prioritised or excluded material.
- Over-reliance on automation – investigators may trust AI outputs too readily, reducing proper human review and judgment.
- Bias and data quality issues – AI reflects the data it is trained on; flawed or incomplete data can skew results.
- Legal challenge and appeals – errors in AI-assisted disclosure may lead to collapsed trials, appeals or abuse-of-process arguments.
- Accountability gaps – uncertainty over who is responsible when an AI-assisted decision proves wrong.
Defence expertise in AI-assisted investigations
Lewis Nedas has decades of experience challenging failures in police and prosecution disclosure across the full spectrum of serious and complex crime, including sexual offence allegations. Where AI tools are used to assist in reviewing or prioritising material, our lawyers are well placed to scrutinise whether statutory disclosure duties have been properly discharged and whether relevant material may have been overlooked or excluded.
As police forces explore automated tools to sift large volumes of digital material — including phone downloads, social media content and messaging data — Lewis Nedas provides robust defence representation in cases involving technically complex evidence. We ensure that automated processes do not replace proper human judgment and that disclosure decisions remain transparent, accountable and legally defensible.
Lewis Nedas has been a top tier firm in the Legal 500 rankings for the past 14 years and has also featured consistently in The Times Best Law Firms list since its launch in 2019.
Fees
We are committed to providing clear and transparent information about our legal fees. We do not operate on a ‘no win, no fee’ basis, but instead offer tailored fees that reflect the complexity and requirements of each case.
Our fee structure comprises our competitive hourly rates and in certain cases we may be able to seek litigation funding. Fees are discussed and agreed with clients at the outset to ensure clarity and avoid unexpected costs.
Need advice?
Contact one of our specialist criminal lawyers for clear, confidential guidance. Call us on 020 7387 2032 or complete our online enquiry form.
Siobhain Egan, Director (Non-Executive)