
Getty Images sues Stable Diffusion creators Stability AI for scraping its content

Victor Barreiro Jr.
Getty Images says Stability AI 'unlawfully copied and processed millions of images protected by copyright'

MANILA, Philippines – Getty Images announced in a press statement on Tuesday, January 17, it was suing Stability AI, the makers of artificial intelligence-powered art tool Stable Diffusion, for infringing on its copyright by scraping images and processing them for use with Stable Diffusion.

The Getty Images statement said it had “commenced legal proceedings in the High Court of Justice in London against Stability AI claiming Stability AI infringed intellectual property rights including copyright in content owned or represented by Getty Images.”

Getty Images asserted that “Stability AI unlawfully copied and processed millions of images protected by copyright and the associated metadata owned or represented by Getty Images absent a license to benefit Stability AI’s commercial interests and to the detriment of the content creators.”

The Verge, in its report, added that Getty’s CEO, Craig Peters, said the company had sent Stability AI a “letter before action” – a formal notification of impending litigation in the United Kingdom. It is unclear whether litigation will also be pursued in the United States.

“We don’t believe this specific deployment of Stability’s commercial offering is covered by fair dealing in the UK or fair use in the US,” Peters said. “The company made no outreach to Getty Images to utilize our or our contributors’ material so we’re taking an action to protect our and our contributors’ intellectual property rights.”

A representative for Stability AI, Angela Pontarolo, meanwhile told The Verge, “Stability AI team has not received information about this lawsuit, so we cannot comment.”

The full details of Getty Images’ lawsuit aren’t public yet, but Peters said the charges include copyright violation and violation of the site’s terms of service – the latter referring to the scraping of images from Getty’s website.

A 2022 analysis of the Stable Diffusion dataset by an independent party found that images from Getty, as well as from other stock image sites, made up some of the contents of the dataset.

The analysis from August 2022 noted, “Unsurprisingly, a large number came from stock image sites. 123RF was the biggest with 497k, 171k images came from Adobe Stock’s CDN at ftcdn.net, 117k from PhotoShelter, 35k images from Dreamstime, 23k from iStockPhoto, 22k from Depositphotos, 22k from Unsplash, 15k from Getty Images, 10k from VectorStock, and 10k from Shutterstock, among many others.”

Additionally, Getty Images’ presence in the training data is visible in the software’s output: Stable Diffusion can sometimes recreate the Getty watermark in its new renderings.

Getty Images appears to be seeking a new legal precedent on AI art tools and the licensing of images. Getty has said it “provided licenses to leading technology innovators for purposes related to training artificial intelligence systems in a manner that respects personal and intellectual property rights.”

It added that Stability AI “did not seek any such license from Getty Images and instead, we believe, chose to ignore viable licensing options and long‑standing legal protections in pursuit of their stand‑alone commercial interests.”

The Getty Images suit follows the filing of a class action lawsuit by a number of artists against Stability AI, Midjourney, and DeviantArt for creating “products that infringe the rights of artists and other creative individuals under the guise of alleged ‘artificial intelligence.'” – Rappler.com



Victor Barreiro Jr.

Victor Barreiro Jr is part of Rappler's Central Desk. An avid patron of role-playing games and science fiction and fantasy shows, he also yearns to do good in the world, and hopes his work with Rappler helps to increase the good that's out there.