One of the best-known frames of Todd Phillips’s 2019 Joker film starring Joaquin Phoenix is that of Phoenix’s “Joker” inside a lift. Let’s imagine a situation in which the user of a generative Artificial Intelligence (AI) model inputted the following prompt: “Create an image of Joaquin Phoenix Joker movie, 2019, screenshot from a movie, movie scene.” As discussed in detail in a recent New York Times article, the output could look like this: 

This situation is neither unique nor unprecedented. If another AI image generator tool were asked to provide “a video game plumber in the style of Mario” or “an image of Dua Lipa,” the results could be these: 

In all the examples above, the AI-generated outputs undeniably resemble the appearance of the characters “Joker”, as played by Phoenix, and Nintendo’s “Mario” from the Super Mario franchise, as well as the likeness of singer-songwriter Dua Lipa.

New academic article

Empirical research shows that large generative AI models may memorize training data, which may include or consist of copyright-protected works and other protected subject-matter, or parts thereof. When prompted appropriately, such models may produce outputs that closely resemble those works and other subject-matter. Against this background, the following questions arise:
  • Under what conditions may the resemblance between such pre-existing works or subject-matter and an AI-generated output (referred to in technical terms as a “plagiaristic output”) be regarded as an actionable reproduction? 
  • Who would be prima facie liable for such acts of reproduction: solely the user of the AI model who inputs the prompt resulting in the infringing output, or could the developer and provider of the model also be deemed liable? 
  • If prima facie liability is established, who may benefit from exceptions under copyright/related rights, and under what conditions? 

I have prepared a study, just published in the European Journal of Risk Regulation, that seeks to answer these questions.

Also taking into account the requirement that a fair balance be struck between the protection of copyright and related rights, the freedom to conduct a business and innovate, and freedom of expression and information, the study starts by mapping the relevant acts under copyright and related rights in the transition from input/training to output generation. It then considers actionable reproduction, the allocation of liability, and potential defences to prima facie infringement and their beneficiaries under EU and UK law. 

Key findings 

Input/training phase and TDM exceptions

Exceptions for text and data mining (TDM) under EU and UK law allow, under certain conditions, extraction and reproduction for TDM purposes, but not subsequent restricted acts, e.g. reproduction and/or communication/making available to the public through output generation. 
Furthermore, Article 53(1)(c) and recital 106 of the AI Act indicate that TDM is not an end in itself, but rather a step in the development and offering of AI models. The AI Act requires providers of general-purpose AI models to put in place a policy to comply with EU copyright and related rights rules, including Article 4(3) of the DSM Directive.
Subject to transitional periods, such an obligation also applies to: (i) new versions/releases of general-purpose AI models placed on the EU market prior to the entry into force of the AI Act and, by operation of Article 111(3) of the AI Act, generally to (ii) general-purpose AI models put on the market in the EU twelve months before the entry into force of the AI Act. 

Infringement through output generation

The test for actionable reproduction differs between copyright and related rights because of the different rationales of protection:
  • The taking of a sufficiently original part of a copyright-protected work is actionable; whereas 
  • The taking of any part of subject-matter protected by related rights or, alternatively, of a part that reflects the investment of the relevant rightholder is actionable.

Liability

Liability for infringing outputs may extend beyond the users of AI models to the developers and providers of such models, depending on the factual circumstances. This is supported by the case law of UK courts and of the Court of Justice of the European Union (CJEU). 
AI developers and providers could be held liable as secondary infringers, accessories/joint tortfeasors, or even as primary infringers. A finding of primary/direct liability may also be foreseeable having regard to CJEU case law on internet platform operators. 
This case law also suggests that the contractual limitation/exclusion of liability of, e.g., AI model providers in relation to infringing activities performed by users of their services may turn out to be ineffective vis-à-vis rightholders in some instances, with the result that their liability could be found to subsist alongside that of the users of such models. 

Defences

For an unlicensed act to fall within the scope of application of a given exception, the relevant conditions thereunder must be satisfied, including having regard to the three-step test and, insofar as the UK is concerned, the requirement that the dealing at hand be fair. The use, and the volume of the use, must be justified in light of its purpose and having regard to its effects on the market for the original work or protected subject-matter. 

Where to read more

On the website of the European Journal of Risk Regulation here.

Content reproduced from The IPKat as permitted under the Creative Commons Licence (UK).