NYSBA Warns Of Risks To Confidentiality Posed By AI

Attorneys must ensure that their use of artificial intelligence does not compromise the attorney-client privilege, a New York State Bar Association report advised Monday, offering additional recommendations for lawyers interested in exploring the burgeoning technology.

The report, delivered by the bar association's task force on artificial intelligence and approved by its governing body on Saturday, highlighted in its guidance on AI the increasing likelihood of data breaches affecting the legal industry broadly.

Attorneys should advise clients when AI tools are being used in their cases, and legal staff such as paralegals need to ensure they handle AI tools properly, according to the report.

"We are aware of the enormous impact [artificial Intelligence] will have on our profession but are also familiar with the many risks it poses regarding confidentiality," Vivian Wesson, chair of the task force and executive vice president and general counsel at the Board of Pensions of the Presbyterian Church, said in a statement. "The technology is advancing at an alarming rate and so it is imperative that we address it at this time."

While the task force noted that New York's Rules of Professional Conduct generally provide helpful guidance, it said more education is needed and recommended the association form a standing committee to address evolving ethical concerns, most notably the technology's propensity to "hallucinate" case law.

The New York state court system's appellate division should at least consider rewriting the Rules of Professional Conduct, according to the report, to make clear that attorneys should stay current on the technology and develop competency with the available tools.

"We have an obligation as attorneys to be aware of the potential consequences from its misuse that can endanger privacy and attorney-client privilege," said Richard Lewis, president of the New York State Bar Association. "I thank the task force for addressing this complex matter and providing direction on how we can incorporate it into our daily routines in a safe manner."

On potential legal regulations, the task force deferred to legislators on whether rules should apply broadly across industries or be tailored to the legal profession.

Since the emergence of popular generative AI tools like ChatGPT, the legal industry has sought to research and incorporate the technology — with both positive and negative outcomes — making it one of the areas of greatest concern for legal tech leaders moving into 2024.

Additionally, several small and large law firms, as well as legal tech companies and court systems, were hit with data breaches last year, and some cyberattacks resulted in litigation. For example, Orrick Herrington & Sutcliffe LLP was involved in two proposed class actions in California federal court over a March 2023 data breach, and Bryan Cave Leighton Paisner LLP was hit with two complaints in Illinois federal court over a February 2023 incident.

--Additional reporting by Sarah Martinson and Christina DeRosa. Editing by Linda Voorhis. 


