Holding Redlich's Sarah Butler discusses the legal recourses available to artists impacted by the growing use of generative AI
Officially, 120 days have now passed since the Writers Guild of America (WGA) began its strike against the Alliance of Motion Picture and Television Producers (AMPTP) in the US. The movement, which has since gone far beyond Tinseltown, has highlighted a need for legal reform in the area of copyright and IP law, especially as it relates to the arts.
The WGA officially set up the picket line on 2 May, with the Screen Actors Guild – American Federation of Television and Radio Artists (SAG-AFTRA) kicking off its own strike on 14 July. According to Holding Redlich special counsel Sarah Butler, at the core of these strikes is the battle to protect artists’ IP against the growing use of generative AI models in the creative industry.
“The key focus of the current strikes by writers and actors in the US is a concern that their intellectual property is going to be undermined and infringed by generative AI”, Butler told Australasian Lawyer. “In particular, writers are concerned that generative AI will be utilised to produce scripts with little human involvement, and actors are concerned that their likeness, voice and image may be reproduced without their permission”.
She pointed out that current copyright laws do, to some extent, protect writers’ work and actors’ performances from being reproduced by generative AI without consent; however, the technology has raised novel issues that will necessitate legislative reform if affected creatives are to have proper legal recourse.
“Generally in Australia, a work will be protected by copyright if there is a human author who contributed ‘independent intellectual effort’,” Butler explained. “Accordingly, it is possible that in some circumstances works generated by AI will not have enough human input to be protected by copyright.”
This, she said, may mean that a creator will not be protected by the Copyright Act “against a third party who reproduces or modifies their work created using generative AI”.
What legal protections do artists in Australia have when AI is used to infringe on their IP without consent?
Earlier this month, US author Jane Friedman made headlines when it was discovered that books she hadn’t written were being sold on Amazon under her name. Friedman said in a post on her website that the fake books looked like they were AI-generated, and that when she initially raised the issue with Amazon, the company dismissed her claim because she did not have a trademark for her name.
Friedman is not the only author to have faced such a concern. And in Australia, court precedent dictates that “the general style and technique of an artist is not protected by copyright”, Butler said, pointing to the case of Cummins v Vella [2002] FCAFC 218.
“Australian copyright law provides protection for artists against both the output product of AI impersonating works and the data sets used to produce such works. Where the work ultimately produced by an AI resembles other original work created by a human author, the ordinary protections of the Copyright Act may apply”, she explained. However, this merely provides recourse in circumstances where “AIs produce works substantially similar to a copyright work”.
Nonetheless, in the event that “copyright infringement can be shown in the data set used to produce a non-infringing work”, Butler indicated that copyright law can still provide a measure of legal protection.
“Generative AIs are trained on data sets which themselves contain original copyright works of the human author. At multiple points in a model’s training, copies of these original works are made, giving rise to potential causes of action for copyright infringement”, she explained. “The use of original works as training data may also infringe authors’ moral rights to the integrity of a work and its protection against being mutilated”.
There are also other legal avenues that creators can pursue if their IP or image has been co-opted by AI without their permission. For instance, Butler pointed out that the protections under the Australian Consumer Law that forbid misleading or deceptive conduct may apply if a celebrity’s image is used to suggest to the public that the person is endorsing a certain product.
The common law action of passing off may also protect a creator in circumstances where “one person has wrongly represented that its goods or services are related to those of another by imitating the ‘get-up’, or look and feel of a product or service”. Nonetheless, she acknowledged that “because subsistence of a reputation is required to successfully establish passing off, this law is of limited use for most people”.
Another recourse open to artists is defamation, if the unauthorised use of an image or IP harms an artist’s reputation. Finally, the Australian Privacy Principles may apply where personal or sensitive information is used in AI training data.