Law schools disclose common concerns, diverse takes on regulating GenAI use by students, faculty

By Cristin Schmitz, Amanda Jerome, and Terry Davidson

Law360 Canada (September 29, 2023, 6:50 PM EDT) -- Law schools have stepped up efforts in the 2023-2024 academic year to help the next generation of lawyers understand and capitalize on ChatGPT’s emergence, while also teaching law students to critically evaluate – and competently and ethically use – generative artificial intelligence (GenAI) in the classroom and in their future law careers.

GenAI’s potential downsides for academic integrity, such as facilitating “cheating” on law school tests and assignments or on the personal statements supporting law school applications, have also been top of mind for law deans, a Law360 Canada national investigation finds.

One common thread in the responses to the 14 multi-pronged questions about ChatGPT and its GenAI competitors that Law360 Canada sent to Canada’s 24 law deans is that law schools are all working on how best to deal with the challenges and opportunities that have emerged since ChatGPT’s debut 10 months ago.

However, no law school professed to have all the answers to our queries, or even final answers, given the rapid development of GenAI. 

Notably, only some law schools have so far devised explicit guidance for the use of GenAI by law students, professors and staff.

And no law school espoused a cookie-cutter or “one-size-fits-all” model for regulating and teaching the use of GenAI in the classroom. Rather, it seems that law professors are mostly being left to decide whether and how GenAI may be used by their students in their particular courses, although some law schools have given faculty members a number of specific dos and don’ts.

Queen’s University Law dean Colleen Flood

Queen’s University law dean Colleen Flood articulated a view expressed by several law deans Law360 Canada interviewed.

“I think, in general, our response to this is ‘proceed with caution,’” Flood said.

“We see GenAI and AI tools as being transformative of law and the practice of law, and the subject-matter areas of law and of society generally,” she explained. “Our view is that we really have to equip our students with the capacities and abilities to work with AI tools, and to be able to meet all the challenges – ethical and others – that are going to come at them in practice,” she said. “If we don’t equip them, our students are going to be left behind.”

The responses to an identical set of in-depth queries Law360 Canada emailed this month to the deans of the two dozen law programs offered by the country’s 23 law schools included “no comments” from eight law schools, some of which said they were too busy at the beginning of the academic year to meet our one-week deadline. The University of Saskatchewan, McGill University, Université Laval, and University of New Brunswick all politely declined comment. Others with no comment for the time being were: Thompson Rivers University (“this is under review. ... cannot currently comment”); University of Toronto (“we are in the process of discussing in the faculty and within the wider university”); University of Manitoba (“the approach ... is still in development regarding this topic”); and the University of Alberta (“still in the process of evaluating and examining the use of GenAI on campus. ... Provost’s Taskforce on AI and the Learning Environment ... will provide recommendations to university communities on how best to handle the opportunities and challenges of GenAI in post-secondary education”).

We did not receive replies from five law schools: University of British Columbia; University of Windsor; Toronto Metropolitan University; Université de Sherbrooke; and Université de Moncton.

However, we did get written responses to our questions — including some detailed answers — from the leaders of Osgoode Hall Law School of York University, Queen’s University, University of Ottawa (Common Law), Western University and Université de Montréal.

Law360 Canada reporters also spoke with deans or associate/vice deans at the University of Victoria; University of Calgary; Lakehead University; University of Ottawa (Civil Law); Queen’s University; Osgoode Hall; Université de Montréal; and Dalhousie University – as well as with several law professors and groups in Quebec engaged with AI issues in law schools. (See also the companion story “French-language law faculties grappling with new breed of GenAI tools.”)

Below is some of what we heard (condensed and abridged) from the Canadian law schools that responded to our queries:

University of Victoria, British Columbia

“We currently do not have a specific policy on AI or the use of ChatGPT,” Andrew Newcombe, the law faculty’s associate dean of academic and student relations, told Law360 Canada. He explained that professors are using their own approaches, based on the demands of their specific courses. These may include a wholesale prohibition of GenAI use; a requirement that any GenAI use be disclosed; and teaching students to responsibly use, and critically assess, ChatGPT as a research tool.

Faculty discussions at the law school around GenAI have focused on aspects of academic integrity and professional ethics, including integrating it as a tool in some courses like legal research, and on what role ChatGPT and its GenAI competitors might play in the creation of the personal statements required to support applications to law school.

University of Victoria Law School associate dean Andrew Newcombe

Newcombe noted that, in recent years, more UVic law professors had been choosing to give students take-home and 48-hour online exams. Such evaluations are seen to provide a more accurate assessment of students’ learning and skills — as well as to more realistically reflect what will be expected of them in the workplace — than the traditional invigilated in-person exam.

“I think what we’re seeing this year is a few more professors kind of moving back to the traditional in-person exam,” Newcombe said. “But the majority of our exams remain online.”

He noted that most take-home exams are not knowledge- or information-focused, but instead require students to apply the law to nuanced and detailed fact patterns, which often have no clear-cut answer. “I don’t think ChatGPT is creating really good answers to those types of complex problems at this point,” he remarked. But GenAI is just going to get better and better, he acknowledged. “We have to be alive to what’s happening.”

Newcombe said the broad academic integrity undertaking students must sign for take-home exams has been modified to expressly bar the use of AI.

Cheating, including on exams, has been rare at the law school, he said. “We highlight to the students that if there ever were an allegation of a breach of academic integrity, that would have to be disclosed” to the law society when they apply to join the bar.

The use of ChatGPT or other GenAI to generate personal statements in support of law school applications is a subject the law school’s admissions committee will be discussing, he said.

University of Calgary, Alberta

“We’re certainly taking this very seriously,” said Dean Ian Holloway, who noted that the law faculty’s professional development day in August focused on how to use ChatGPT constructively and on various concerns about GenAI. “We didn’t come up with any definitive answers,” he disclosed, noting the development of AI “is still moving very, very quickly.”

Holloway said the law school doesn’t have a “single blanket” position specifically on GenAI use. Different professors take different approaches, based on their courses’ pedagogical goals. However, a general rule still applies that “all students should behave with propriety at all times.”

University of Calgary Law dean Ian Holloway

Rooting out GenAI-facilitated cheating “is a bit of a mug’s game” because it is so hard to detect, Holloway said. Some professors are requiring students to state whether they have, or have not, made use of GenAI in course work. “In a way, this is forcing us to go back to the old-fashioned vision of honour,” Holloway remarked. “Whenever something new is introduced there’s a period of bumpiness and in a few years’ time, we will all learn, more or less, how to use AI constructively in the classroom,” he predicted. “Mistakes will be made, but we’ll learn how to do it.”

Holloway was less sanguine about ChatGPT’s impact on the law school admission process. He explained that GenAI presents a challenge for UCalgary and other law schools that take a “holistic” approach to admissions. Since they look at the “whole person,” rather than mostly the GPA and LSAT, they put much more emphasis on applicants’ personal statements, he explained.

“The personal statement is so important,” Holloway emphasized, noting UCalgary law school rejects some applicants who are readily accepted by other law schools, simply because they are not seen to be the right fit for the law school.

“We spend thousands of person-hours every year on reading personal statements,” he advised. “And so the question is: How many of the personal statements we’re about to start receiving have been written by ChatGPT? And what do we do about that? We’re talking about it a lot in our admissions team but we just don’t know yet what the answer is.”

Lakehead University, Thunder Bay, Ontario

The law school expressly states personal statements in support of law school applications may not be AI-generated.

For AI use in the classroom, “we have developed some model language for syllabi, which ultimately preserves the ability of instructors to either permit the use of GenAI, prohibit it altogether or permit it for some purposes and not for others, or permit it for certain platforms and not others,” said Dean Jula Hughes. She commented that some professors are very savvy and engaged with GenAI while others are less so. “We wanted to make sure that everybody has it on their radar and is aware of some of the challenges that can arise and that everybody is communicating really clearly with the students about what is permitted in their classes or what is not permitted.”

Lakehead University Law School dean Jula Hughes

A question discussed at Lakehead was whether the existing academic offences cover the misuse of GenAI, Hughes said. “The opinion of the folks who run our Student Code of Conduct process here was that, in fact, yes, our language was broad enough to capture that. I think that’s a fair enough position to take in the initial round and, I think as the current academic year progresses, we’ll probably find out how that’s all going to unfold.”

Hughes said it remains to be seen how plagiarism is to be detected in the ChatGPT era. “You can’t say, ‘Okay, you plagiarized this thing because I can find it in the library and here’s what you copied,’” she remarked. “So at what point do we, even on a balance of probabilities, say ‘this is more likely the result of an AI tool than it is the result of student work?’”

Hughes said that her own sense, at least for now, is that AI tends to produce very affirmative and solid statements but is not that good at hedging. “As we know in law, most things are subject to caveats and conditions and an appreciation that the courts may take a different perspective, and all those kinds of things, so there’s a professional quality to legal writing that is really about clearly identifying the limits of the opinion,” she explained. “And I don’t know that AI knows how to do that yet.”

Hughes emphasized the importance of law schools teaching students to use AI tools ethically and competently, both at school and in legal practice. This includes AI that may be embedded in everyday software, such as Google Docs or Grammarly. “We have not taken the position that students cannot use those tools,” Hughes said. “But I think we are in the process of maybe learning more about whether there are boundaries around those uses as well.”

GenAI poses practical ethical questions in law school. “Particularly in times when there’s a lot of assignments due, when students are feeling the stress of law school, people may resort to AI tools just as a way of managing their workload,” Hughes said. “I think that offers a lot of opportunity for engaging students about what constitutes ethical conduct in law ... about how it’s really not the idea that you can choose to act ethically at times where it’s convenient and then choose not to act ethically at times where it’s inconvenient to do so. And that, I think, will be a bit of a learning journey for all of us because it’s so temptingly easy.”

Hughes said she is exploring integrating into the curriculum teaching students how to think critically about AI use in law-related systems, such as the use of AI in decision-making and in articulating reasons for a decision. “Having some critical ability to understand the policy issues that arise when the bot makes the bail decision is very important for us,” she pointed out. “I think students need to have familiarity with ‘human in the loop’ requirements — thinking about how AI systems can both advance anti-discrimination efforts and also defeat them,” Hughes said. “So that critical capacity that we always need in the law is also going to be really important in the context of this evolving technology.”

Western University, London, Ontario

“Instructors are best positioned to determine the assessment practices and learning outcomes for their courses,” said Dean Erika Chamberlain. “At Western Law we enable individual instructors to determine if and how they will allow or prohibit the use of GenAI in the classroom. When considering GenAI content-creators like ChatGPT, Western encourages instructors and faculty members to talk to their students about the technology and to set clear expectations about its use in their course.”

Western’s pedagogical approach centres “on analysis and judgment” and the law school’s exams are conducted in person, she said.

Western University Law School dean Erika Chamberlain

Chamberlain said the law school knows its students will be using GenAI when they come to practice. “We recognize that it may be a useful aid for legal editing, research and drafting, but we also know that over-reliance on GenAI, without critical and independent thought, creates a risk of legal inaccuracy and potential breaches of a lawyer’s fiduciary obligations and the duty of competence.”

The dean said her school is actively exploring opportunities to use GenAI in the classroom and has provided learning and development opportunities for faculty. The school has a Law and Disruptive Technologies course and will provide more opportunities for students to learn about the practical applications of GenAI, she said.

Osgoode Hall Law School of York University, North York, Ontario

“At Osgoode, I think we’re certainly looking at AI both from a risk management, regulatory side of things, as well as an opportunity and innovation side of things,” explained Dean Trevor Farrow, who highlighted his law school’s “huge focus of attention” on AI.

Osgoode Hall Law School dean Trevor Farrow

“Academic integrity matters, and we really care about it,” Farrow stressed. “At the same time, my view is that it’s really hard to put the toothpaste back in the bottle and I’m not sure we want to.”

“AI is here,” he said. “I think it’s a revolutionary moment in society, generally, and I think it would be a huge lost opportunity to not embrace it, to explore it, to experiment, and to really think through what its power is in the context of legal education, and ultimately serving the public.”

Farrow said GenAI could expand people’s access to justice. “I do not see AI as replacing lawyers, but what I do see is AI assisting with the delivery of legal services, making some services more efficient, and pushing us to rethink where we deploy our resources, in terms of human capital, as we respond to modern legal needs,” he explained. “There is so much legal need out there that we will never service it in the traditional way of per-hour lawyer services. There’s way more demand than there is supply, and that’s important to recognize, and so if we can support the profession, empower lawyers to sort out how best to meet that growing and huge demand, I think AI has the potential to fill in some gaps that are never being served.”

At the same time, lawyers will still fill “the large space that is still needed with human judgment, human assistance, and the many things that lawyers provide in terms of legal counsel,” he added. “There’s still a huge demand for that kind of guidance and that kind of legal work. And I think that’s what AI is going to force us to embrace, and to think about, and to define in terms of that kind of work.”

Last February, York University Senate’s academic standards committee issued a special statement on the use of AI technology for academic work.

Osgoode communicated the AI policy to students and instructors, including adding a special notice on the cover sheet of all April 2023 exams and a mandatory subsection in all course outlines for the 2023-24 academic year. The law school also established a “Working Group on AI and Other Revolutionary Technologies,” and its strategic plan for 2021-2025 specifies that a key goal is to “foster technical capabilities and critical reflection among law students, practitioners and academics regarding the use of technology in the legal profession.”

The university has issued guidelines on AI and academic integrity.

The law school said in a statement, however, that it provides “significant room for instructors to experiment with the use of AI with their students.”

GenAI is permitted where specifically authorized by a given instructor for student use as an assisting tool – e.g., in research and paper writing. It is also permitted “through active introduction and use within the teaching of a course, including experimentation. Use by students is prohibited otherwise for academic honesty reasons,” as set out in York’s AI policy.

“Like any new technology, GenAI presents risks and opportunities,” Osgoode’s statement says. “Critical to our discussions is the balance between protecting academic integrity in all aspects of the academic mission (including evaluation), while at the same time experimenting with and promoting innovative opportunities for the use of AI to advance pedagogical objectives, enhance the learning experience and, overall, explore opportunities for law and legal practice to better service the public.”

Concerns and ethical issues raised by faculty members include research into how GenAI’s text- and data-mining potentially clash with copyright law, as well as arguments as to why ChatGPT and tools like it “frequently commit plagiarism,” the law school said. “The line between GenAI as an academic aid and GenAI as deskilling students is also under discussion.”

Detection of student cheating by using GenAI “is currently difficult because digital tools are mostly unreliable (far too many false positives) and/or compromise data privacy and security,” the law school said. “In our experience, the vast majority of law students are not interested in cheating, and do their best to comply with all academic rules and policies.”

Osgoode offers 16 JD law and technology courses, while AI is an “expanding” topic in some eight intellectual property courses and several clinical programs, as well as courses related to labour law and other areas.

University of Ottawa Faculty of Law (Common Law Section)

Kristen Boon, University of Ottawa Common Law Section dean

UOttawa Common Law Section dean Kristen Boon said the law school has been “the leader in the broad area of law and technology, including AI, for more than 25 years.” It is home to the Centre for Law, Technology and Society, the AI & Society Initiative, and faculty members who research AI.

The law school is offering several AI-focused courses in 2023-2024, such as Regulating AI, Privacy and AI, and AI and the Legal Profession.

The university has updated its academic integrity regulations to account for the use of AI. All undergraduates are required to complete mandatory training on academic integrity, including on the appropriate use of GenAI.

Using GenAI on exams is prohibited. “Students must sign an attestation confirming they did not use GenAI,” Boon said. “Violations are treated as academic fraud and are handled according to the University of Ottawa regulations on academic integrity. Additionally, exam software may be used to prevent access to AI tools during exams. For other types of assessments, such as assignments or papers, based on our guidelines, individual professors decide whether to prohibit GenAI or integrate it into the learning outcomes. If a prohibition is in place, students must sign an attestation confirming they did not use GenAI. Where GenAI is permitted, the student work must comply with the University of Ottawa Academic Regulations of Academic Integrity. In addition, students must provide a signed attestation detailing its use, purpose, and the prompts they engaged.”

The common law section has developed guidelines for using GenAI in teaching, Boon said. “Individual faculty members may decide to permit or prohibit the use of GenAI in the learning outcomes and assessments. A paper or an assignment should communicate a student’s original contribution to a selected topic. Therefore, an assignment fully or partially produced by GenAI is not permissible (even if a student discloses that they used GenAI to generate the text). However, there is nuance in how GenAI may be used and instructors are encouraged to consider which uses they find permissible, and which are not, considering the learning outcomes. When GenAI is prohibited, students are required to sign an attestation confirming they did not use GenAI. If GenAI is permitted, students are required to sign an attestation detailing the use of GenAI.”

Boon noted that the Ontario Law School Application Service, which governs the law school admission process in Ontario, prohibits the use of GenAI on admission essays. The service has a working group looking at issues around GenAI and law school applications.

“We will monitor the situation and assess if any changes need to be made at the faculty level for next year,” she said.

Queen’s University, Kingston, Ontario

“Watchful learning is where we’re at,” dean Flood said.

“We’re leaving it to individual faculty — academic freedom is an issue here — so faculty can say in their particular course, ‘You’re not to use it.’ But our general stance is permissive,” dean Flood said. “So unless the faculty member says it’s verboten, the student can use it as long as they acknowledge, and cite, and source. We accept that we don’t know what we don’t know ... so we are planning on monitoring it this year and ... planning a faculty retreat next May and ... talk about how it has gone in individual classrooms, and try to do some analysis and assessment.”

Flood said “our view is that we really have to equip our students with the capacities and abilities to work with AI tools, and to be able to meet all the challenges, ethical and others, that are going to come at them in practice. So they both need to have the technical skills about how to use AI tools, but also be equipped to address the ethical challenges, for example, around algorithmic bias and these kinds of things.”

The successful students of the future, as professionals and leaders of society, will come equipped with the capacity to work with AI tools, as well as to understand them and their limits, including awareness of limitations from the perspective of vulnerable populations, Flood said.

Queen’s has adopted three “guiding principles” for AI use in the classroom. “Our default position is permissive in that students are permitted to use GenAI for learning purposes; however, faculty may set limits to, and expectations for, the use of GenAI as they see fit for their courses.” There is a mandatory rule of attribution when it comes to GenAI “in the same way that we do for all contexts requiring academic integrity, i.e. text generated by AI cannot be presented as one’s own.”

In a written statement, Queen’s said courses which use written assignments, such as research papers, require more consideration by professors with respect to GenAI than the traditional invigilated exam.

“For example, should the use of GenAI be permitted for research purposes? If not permitted, should students be required to submit an attestation with their work? If permitted, how should it be cited? How should the importance of verifying the generated text and quality control be conveyed to the students? We have encouraged our faculty members to experiment with a view to sharing our collective experience at the faculty retreat in May 2024.”

Detecting cheating “is a complicated issue,” Queen’s said. “We do not yet have a reliable tool for detecting the use of GenAI. Therefore, we have emphasized educating our community members about the potential and risk of GenAI.”

Queen’s has made “strategic investments” to explore the pedagogical, research and practice implications and opportunities of emerging technologies, including a dedicated research group called the Conflict Analytics Lab, which is developing an AI platform specifically for legal professionals called OpenJustice, in collaboration with Harvard, Northwestern and global legal clinics.

As more GenAI tools are developed by legally trained people, thereby improving the tools’ reliability, “I think our productivity will explode,” Flood suggested. “Hopefully it means better access to justice for more people. We can do more. Service more people. And still make a good living. So that’s kind of exciting, if it works out like that. Hope springs eternal in a techno-optimist breast.”

Flood added that law schools learning from each other “will be really, really important,” as well as “how will we equip students ethically to deal with this as part of their professional training? And that’s hard because it’s evolving and changing really quickly, and so what we would say today may well be quite different in a year or so.”

The ethical challenges will change as the validity and the accuracy of the tools improve, she remarked. “So we need to evolve and adapt to keep up with it, as law schools — which is not the easiest thing to do, to be honest. We definitely are trying, and I think if we work together, we can do better things and learn from each other.”

Dalhousie University, Halifax, Nova Scotia

“Current faculty are encouraged to consider how they might want to permit or prohibit students from using AI as a learning and/or assessment tool,” said Dean Sarah Harding, who confirmed her law school has no official policy on GenAI use by students. “Right now, faculty are encouraged to be clear about what the rules of engagement are for the classroom; to be clear about their approach to students, [and] their rationale for that approach,” she advised.

Dalhousie University law school dean Sarah Harding

Faculty members can draw from “guiding principles” to help them decide what is best for their courses, Harding said.

“Some … may see advantages to fully integrating AI into the learning and assessment process. Some may wish to permit students to use it as a learning tool, but require students to clearly produce work of equivalent academic value without the use of AI tools. Some may wish to be far more restrictive.”

Harding suggested “students may also be asked to participate in a discussion to refine that approach.”

With files from Luis Millán 

Photo of Colleen Flood: Bernard Clark


If you have any information, story ideas or news tips for Law360 Canada, please contact Cristin Schmitz at Cristin.schmitz@lexisnexis.ca or call 613-820-2794.