Getty Images Sues AI Artwork Generator Alleging Massive Copyright Infringement

Photo Credit: Note Thanun

Getty Images says it has started legal proceedings against Stability AI, the company behind the AI artwork generator Stable Diffusion.

The lawsuit was filed this week in London’s High Court and alleges that “Stability AI unlawfully copied and processed millions of images protected by copyright.” Getty Images says it believes artificial intelligence has the potential to help stimulate creative endeavors. “Accordingly, Getty Images provided licenses to leading technology innovators for purposes related to training artificial intelligence systems in a manner that respects personal and intellectual property rights.”

“Stability AI did not seek any such license from Getty Images and instead, we believe, chose to ignore viable licensing options and long-standing legal protections in pursuit of their stand-alone commercial interests,” the statement from Getty Images concludes. 

AI artwork tools like Stable Diffusion do create original pieces of art. Humans give the AI ‘prompts,’ which are nothing more than collections of words. Here’s a quick example of how it works.

An example of an AI art prompt might be, “A Gothic castle perched on a hill with a thunderstorm and lightning in the sky as mermaids dance in the foreground.” To render that prompt, the AI must have been seeded with thousands of images of castles, thunderstorms, hills, mermaids, and people dancing just to understand the image it has been tasked with creating.

Companies collect and collate images and labels themselves in order to train their AI. Massive datasets with millions of entries help the AI learn what a human understands a ‘mermaid’ to be. Sometimes these companies turn to dataset organizations like LAION, which distributes the data it collects freely. “LAION datasets are simply indexes to the internet,” its FAQ reads.
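Because a LAION-style dataset is an index rather than a collection of images, each record points at an image somewhere on the web and pairs it with a text caption. A minimal Python sketch of the idea (the field names and URLs here are illustrative, not LAION’s actual schema, which also carries similarity scores and other metadata):

```python
# Illustrative sketch of an "index to the internet": each entry pairs
# an image URL with its caption. These records are made up for the example.
entries = [
    {"url": "https://example.com/castle.jpg",  "caption": "gothic castle on a hill"},
    {"url": "https://example.com/mermaid.png", "caption": "painting of a dancing mermaid"},
    {"url": "https://example.com/storm.jpg",   "caption": "thunderstorm over the sea"},
]

def find_images(entries, keyword):
    """Return the URLs of every entry whose caption mentions the keyword."""
    return [e["url"] for e in entries if keyword in e["caption"]]

# A trainer looking for 'mermaid' examples would fetch the matching URLs.
print(find_images(entries, "mermaid"))
```

Training pipelines then download the images behind those URLs, which is why the question of who holds the copyright on the indexed images sits at the center of the dispute.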

Stability AI is open about using LAION-generated datasets as seed material. Other AI tools are more ambiguous about the source of their millions of seed images. Stability AI says it plans to allow artists to ‘opt out’ of appearing in its image training data, but shouldn’t this kind of thing be opt-in for creatives?