Artificial intelligence has been used widely in the criminal justice system, with jurisdictions around the world employing AI to assist police investigations, analyze evidence, support judicial decision-making and improve data analysis. Specific technologies include facial recognition technology (FRT) and biometric surveillance, social media analysis, licence plate readers, bail and sentencing algorithms, and drones.
According to research by the Law Commission of Ontario (LCO), many police services and others believe FRT and related technologies are “game changers” that could transform the criminal justice system. At the same time, it is widely acknowledged that criminal AI systems raise profound risks of false arrest, mass surveillance, biased policing and reduced access to justice.

“We know that police have used facial recognition systems and facial matching technology, where they use photo databases and images on file to try to identify people,” said Law Commission of Ontario policy counsel Ryan Fritsch. “We also know that licence plate and other object recognition tools are now being used, and these are quite powerful when they are linked to other data like a person’s Wi-Fi trail across the city.”
Despite these concerns, there are still no laws governing the use of AI by law enforcement, prosecutors, courts or legal counsel in Ontario or elsewhere in Canada. The LCO has now launched a multi-month public consultation on AI in the criminal justice system, aimed at eventually providing the provincial government with recommendations for change.
The commission is starting off by releasing several policy papers that pose a number of questions about the use of AI in a criminal justice context: What are the benefits and risks of these technologies? What issues could arise? What is the state of Canadian law and procedure for addressing those issues? And what proactive steps are needed to meet the challenges of AI in criminal justice?
Fritsch noted that Ontario passed legislation in 2024 to govern the use of AI in the public sector, but the law didn’t touch on courts or police use of AI.
“There’s a huge hole in the legislation, and so part of what we’re looking at is what are some of the regulatory options for Ontario,” he said. “And what’s unique about our project is that we’re not looking at one particular institution or one specific use of AI or technology, but thinking about AI as a systemic or life cycle challenge to criminal justice, that is, police using AI technology.”
The LCO is looking to get input from a broad range of stakeholders, including lawyers and legal organizations, NGOs, industry representatives, academics, government and justice system leaders, and individual Ontarians interested in the operation of the criminal justice system. Individuals or organizations interested in working with the LCO are encouraged to contact Fritsch at rfritsch@lco-cdo.org.
The LCO is also accepting written submissions, which can be sent to the LCO’s general email address at LawCommission@lco-cdo.org. The deadline for written submissions is July 7, 2025.
Fritsch said the LCO is aiming to have final recommendations to government by late 2025 or early 2026.
If you have any information, story ideas or news tips for Law360 Canada, please contact Ian Burns at Ian.Burns@lexisnexis.ca or call 905-415-5906.