The CEO of Getty Images just delivered a public smackdown to AI copyright thievery


Here’s a prediction for 2025: There will be even more lawsuits over the unlicensed use of content to train AI models.

In the music industry, we’ve already seen legal action filed by major recorded music companies and publishers against gen AI startups Suno and Udio, as well as AI giants Anthropic and OpenAI.

In the wider world of media and entertainment, OpenAI has also been sued by the New York Times and authors including Sarah Silverman. Getty Images, meanwhile, is suing Stability AI for allegedly using its photos without permission, and Dow Jones and the New York Post are suing Perplexity for allegedly copying their work without a license.

The type of content in question across all of these lawsuits might be different, but the allegations are largely similar: AI companies are using copyrighted material to train their systems without permission.

One key argument made by some AI companies and their financial backers in response to the allegations of infringement is that training AI using copyrighted content available on the Internet is ‘fair use’ under copyright law.

In a scathing op-ed published by Fortune this week, Craig Peters, CEO of photography agency Getty Images, rubbished that argument, advocating instead for a nuanced, case-by-case approach to assessing fair use – one that supports AI’s potential for societal good without undermining the creative industries, including music.

As Peters notes in the column for Fortune, Getty employs over 1,700 people and represents the work of more than 600,000 journalists and creators worldwide.

The company generated $240.5 million in revenue in Q3 (the three months to the end of September), up 4.9% YoY, and projects that its FY 2024 revenues will be between $934 million and $943 million.

“Copyright is at the very core of our business and the livelihood of those we employ and represent.”

Craig Peters, Getty, in an op-ed for Fortune

“Copyright is at the very core of our business and the livelihood of those we employ and represent,” writes Peters.

He says that he “vigorously disagree[s] with the sweeping position outlined” by the likes of Microsoft AI CEO Mustafa Suleyman, who, Peters writes, has made remarks in the past suggesting there is “no copyright protection for online content”.

Peters adds: “This disagreement underscores why we are litigating against Stability AI in the U.S. and the UK. We did not grant Stability AI permission to use millions of images owned and/or represented by Getty Images to train their Stable Diffusion model which was made commercially available starting in August of 2022.”

Peters notes that, “as litigation slowly advances, AI companies advance an argument that there will be no AI absent the ability to freely scrape content for training, resulting in our inability to leverage the promise of AI to solve cancer, mitigate global climate change, and eradicate global hunger.”

He adds: “Note that the companies investing in and building AI spend billions of dollars on talent, GPUs, and the required power to train and run these models — but remarkably claim compensation for content owners is an unsurmountable challenge.”

Peters also argues that ‘fair use’ “should be applied on a case-by-case basis” and that AI should not be viewed as “one monolithic case”, but should be treated as “a wide range of models, capabilities, and potential applications”.

He asks in the op-ed: “Does curing cancer impact the value of Kevin Bacon’s performances? Clearly no. Does solving for climate change impact the value of Billie Eilish’s music? Clearly no.

“Does solving for global hunger impact the value of Stephen King’s writing? Again, clearly no. Not only does it not harm the value of their work, they would likely never challenge such a use if it could benefit those aims even if such a use might be commercial in nature. As the CEO of Getty Images, I can say we would never debate or challenge these applications and that we would wholeheartedly welcome any support we could offer toward these applications.”

Peters argues further that “content generation models” that generate “music, photos, and videos based on text or other inputs” and have been “trained on the content of artists absent [of] their permission” do not have “the potential to elevate our societal outcomes”.

He says that this use of AI is “pure theft from one group for the financial benefit of another”.

“Let’s stop the rhetoric that all un-permissioned AI training is legal and that any requirement to respect the rights of creators is at the expense of AI as a technology.”

Craig Peters, Getty

Getty’s CEO also draws a parallel between the AI content sector’s evolution toward licensed players and the rise and fall of Napster and other illegal download services, which gave way to licensed streaming platforms like Spotify.

“As the licensed models of Spotify and Apple Music evolved from the infringing original Napster, there are AI models developed with permission and with business models that reward creators for their contributions,” writes Peters in his op-ed for Fortune.

He adds: “Like Apple Music and Spotify, they will cost a bit more, but they can thrive and be broadly adopted if we create a fair playing field by addressing those companies that choose to ‘move fast and break things’, in this case, break established copyright law.”

Peters’ piece concludes with the argument that there “is a fair path that rewards creativity and delivers the promises of AI”.

He adds: “Let’s stop the rhetoric that all un-permissioned AI training is legal and that any requirement to respect the rights of creators is at the expense of AI as a technology.”

Music Business Worldwide
