Avril Hasselfield
The legal profession is not exempt from this question, nor is the field of mediation in which I work. Although I see some practical uses for AI, such as research or data analysis, I do not believe a computer can replace a human when it comes to resolving disputes between people, either in general or in a legal context specifically.
When parties choose a mediator, they look for someone they can trust and respect. In the disputes I am asked to help resolve, it is not uncommon that at least one party has worked with me before or that I have been recommended by others who know me.
Since the beginning of my legal career, when I was called to the Ontario bar in 2002, I have fostered the reputation on which my mediation practice is based. I started at a full-service law firm representing primarily plaintiff litigants and then moved in-house to a large insurance and financial services company. This has given me the perspective of both sides of litigated disputes. Throughout my career, I have endeavoured to be a knowledgeable, fair and effective advocate whom people want to work with and whose opinion they value.
Most mediators are also litigators with years of practical experience in dispute resolution. An AI platform can “learn” from data but all it can do is manipulate the data it is given in a manner it is “taught” by its programmers. It has no emotional intelligence. It cannot share experiences, empathize with people or defuse heated situations.
There is an argument that the lack of emotional bias creates a more neutral view. I would argue those same traits make AI heartless and devoid of the human emotion that is part of every legal procedure. Humans are social by nature and mediation is a complex emotional interaction with human connection, analysis and compromise.
After more than 20 years of practice, I have experienced and observed almost all the emotions that come with mediations. Each party believes its position is sound and fair, and there is often resentment, frustration or anger toward the other side. An AI platform cannot pick up on those nuances.
Emotional intelligence is the cornerstone of mediation. By reading people, human mediators can help defuse emotions and allow parties to focus on a solution that is acceptable to both sides.
Any AI platform is only as good as the data it draws on to reach a conclusion. Since all content and programs are created by humans, all AI is arguably flawed with the biases of its creators and data. As Mediate.com has noted, ChatGPT’s “responses may be based on gender, racial and myriad other biases of the internet and society. By changing the context, the statements can even be manipulated. Consequently, ChatGPT does not appear to be entirely independent after all.”
The responses from AI platforms are 100 per cent logic-based. The law is not. As the late U.S. Supreme Court Justice Oliver Wendell Holmes, Jr. famously observed, “The life of the law has not been logic: it has been experience.”
With mediation, the issues that need to be resolved between the parties may have nothing to do with the ultimate resolution of the lawsuit. Only humans can provide this analysis and support the parties in reaching final decisions they subjectively perceive as fair.
Ethics is a cornerstone of the legal profession and evolves based on social nuance and moral principles. At least at this stage in AI development, a computer platform does not have the ability to understand morality or the human condition. While ChatGPT might be used to write essays or populate legal briefs, it is a long way away (if ever) from replacing humans as mediators, since a cut-and-paste solution is not enough.
Is an AI platform a better mediator than a human? I asked ChatGPT itself, entering the question, “Will ChatGPT replace human mediators?”
I received this answer:
“ChatGPT and other AI language models like it have the potential to assist and enhance the work of human mediators in certain contexts, but they are unlikely to fully replace human mediators in many situations.”
Anything that changes the way we work creates an element of fear of the unknown, and any new process needs to be vetted to determine its value. I do not believe mediation, like law, will ever be a field in which people want to take the advice of a computer over that of a human. While AI might be able to mimic humans, it will never be able to replace them.
Avril Hasselfield is a partner at Results Mediators.
The opinions expressed are those of the author and do not reflect the views of the author’s firm, its clients, Law360 Canada, LexisNexis Canada, or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.