Artificial Intelligence and voice contracts: Where are we now in Entertainment Law?

By Emma Chapple

Law360 Canada (October 10, 2024, 2:34 PM EDT) --
In the evolving battle between artificial intelligence (AI) and the personality, characteristics and attributes of actors and other famous personalities, the human voice has become a persistent battleground, and one that entertainment lawyers follow keenly.

If you had to name James Earl Jones’s one defining attribute, you would say his voice. And for good reason, too. The actor, who passed away on Sept. 9, 2024, was the voice of Star Wars villain Darth Vader, The Lion King’s Mufasa and CNN’s branded voice-over station identifier. He was literally “The Voice of CNN” for 34 years, his longest-running gig.

Before his death, Jones came to an agreement with The Walt Disney Company (which, in case you forgot, owns and controls both Star Wars and The Lion King) to allow the company to continue to use his voice recordings, likeness and vocal style for future appearances. This includes not only archival recordings by Jones himself but also vocal synthesis through artificial intelligence to generate new dialogue. It was a conscious and legal contract from which Jones’s heirs will continue to benefit.

It’s not clear if Jones’s agreement with CNN might also have contained a provision for what would happen with his prerecorded voiceovers after his death.

Using AI to recreate a distinctive voice, however, is still novel in entertainment law. AI was used, to some controversy, in the 2021 documentary Roadrunner, about the life of the late Anthony Bourdain, to create the impression that Bourdain himself was narrating parts of the film.

In late September 2024, Meta announced that it is bringing the voices of veteran English actress Dame Judi Dench, actor and professional wrestler John Cena and comedian Keegan-Michael Key to its AI chatbot. Users won’t be hearing the real actors, though, but AI chatbots using recreations of their voices.

Wouldn’t it be great to have (almost) a conversation with Dame Judi Dench? You could pretend you were getting instructions from “M” herself (a role based on Stella Rimington, the real-life director general of MI5 from 1992 to 1996) because, for whatever reason, James Bond was unavailable to save the world from the latest villain-du-jour. At the very least, it would make using Facebook’s AI chatbot more fun and interesting, maybe even smarter. But here’s the point: Judi Dench’s voice is distinctive and instantly recognizable, something that AI can replicate but did not create.

At the other end of the spectrum, earlier this year, Scarlett Johansson voiced her objection (pun intended) to OpenAI’s use of an imitation of her voice for a new ChatGPT feature, a persona called “Sky.” In fact, Johansson claimed that OpenAI’s CEO, Sam Altman, had approached her for permission to use her voice for the feature, which she declined.

OpenAI was undoubtedly trying to cash in on Johansson’s film Her, a science-fiction romance. Many are now heralding the 2013 film as 10 years ahead of its time. The film follows Theodore Twombly (Joaquin Phoenix), a man longing for connection, who develops a relationship with Samantha (Scarlett Johansson), an artificially intelligent virtual assistant personified through a female voice but no physical body.

At one point, Samantha reveals: “I’m yours, and I’m not yours,” a stark reminder that she serves as the operating system for thousands of other users besides Theodore, and is dating 641 of them, too. Mic drop!

Nevertheless, OpenAI went ahead with Johansson’s “voice,” perhaps deciding it was better to act first and beg for forgiveness later. OpenAI later dropped the feature following Johansson’s objection.

The controversy swirling around AI voice recreation creates new legal problems flagged by entertainment lawyers:

  • How can artists protect against unauthorized use, like in the case of Johansson?
  • Alternatively, how can those with no objection to the technology benefit from it and set ground rules for its use?

The recognized Canadian tort of ‘misappropriation of personality’

Not surprisingly, legal precedent in Canada is thin. But past cases regarding personality rights may shed some light on how the law may treat AI in the future.

The unauthorized use of a person’s distinctive characteristics for commercial gain is a recognized tort in Canada. A distinctive characteristic can be a person’s name, voice or likeness. When such a characteristic is used for commercial gain without the person’s consent, that use can give rise to an action for misappropriation of personality.

Canadian courts have recognized the tort of misappropriation of personality in a pair of Ontario cases from the 1970s: Krouse v. Chrysler Canada Ltd. et al. (1973), 1 O.R. (2d) 225 (C.A.), and Athans v. Canadian Adventure Camps Ltd. et al. (1977), 17 O.R. (2d) 425 (H.C.).

Meanwhile, some legal scholars may remember the infamous 1992 United States appellate case Vanna White v. Samsung, in which the Wheel of Fortune personality sued the electronics manufacturer over a humorous advertisement that, somewhat presciently, featured a robot version of her.

In mid-September, California Governor Gavin Newsom signed two bills into law that protect actors and performers from artificial intelligence replicas of their likeness or voice being used without their consent.

Such protections were at the forefront of labour negotiations during last year’s months-long strike by SAG-AFTRA, the Screen Actors Guild-American Federation of Television and Radio Artists. What is certain? Misappropriation of personality lawsuits are likely to continue in Canada. A solid contract penned by an entertainment lawyer would give artists and creators better protection.

Emma Chapple practises at the confluence of Entertainment Law, Business Law and Civil/Commercial Litigation at Massey LLP.

 The opinions expressed are those of the author(s) and do not necessarily reflect the views of the author’s firm, its clients, LexisNexis Canada, Law360 Canada or any of its or their respective affiliates. This article is for general information purposes and is not intended to be and should not be taken as legal advice.

