‘There is no legal or indeed moral excuse for the commercial use of music by AI companies without the prior permission of songwriters and rightholders.’


MBW Views is a series of exclusive op/eds from eminent music industry people… with something to say. The following comes from John Phelan, Director General of ICMP, the global trade body representing the music publishing industry worldwide.

ICMP’s membership spans the majors, thousands of indies and 76 national trade associations across six continents. ICMP says that it “defends the rights behind approximately 90% of the world’s commercially released music – more than 160 million tracks, of every genre”.


Anyone saying music, politics, law and technology shouldn’t mix is whistling in the wind. It happens daily. Today it happens in a big way, as the EU Artificial Intelligence Act hits the statute books. Given this landmark, let’s dive a little into the ramifications for the music industry.

No discussion on the topic should start without first answering: ‘what do you mean by AI?’

At ICMP we categorise this broad umbrella term into 5 activities relevant for the music business:

  • Longstanding B2B and B2C services
  • AI assisted creative processes
  • At-scale, automated accessing of digital content (“ingestion”)
  • Model training
  • Generative AI, including purely machine-generated works (the truest definition of ‘AI music’)

AI – Not Novel and Blowing Minds

At a business level, many forms of AI have been integrated in the industry for years.

They are deployed to help licensees search databases for the right song to synch with ad campaigns, or to process metadata, or by A&R teams to find new talent. The industry even uses AI technologies to tackle pirated AI music.

For music fans, Brian Eno made extensive use of Generative AI on his 2017 album ‘Reflection’.

Last year, U2 blew concert-goers’ minds in the Vegas desert with spectacular GenAI visuals at the MSG-owned Sphere.

And has there been a more beautiful symbiosis of AI and creativity than the model training which revivified Randy Travis’ voice to sing his lyrics, enabling him to release new music after a debilitating stroke 10 years ago?

AI technologies, including GenAI, can aid creativity and drive royalties – but only with the prior consent of creators and under licensed terms.

So Why New AI Laws?

Too many AI companies, and Big Tech companies with AI divisions, don’t share those motives.

Many are using AI techniques to automatically access and train on the world’s digital music, without songwriters and rightholders’ authorisation.

Microsoft’s AI CEO has fantastically claimed knowledge of a “social contract” for the internet, apparently existing since the 1990s, enabling his company to use content posted online – including music – unless told otherwise.

To paraphrase Wolfgang Pauli, this is ‘not even wrong’.

Such claims are not just nonsense, however – they are a very deliberate tactic by some ‘Big Tech’ companies to maximise copyright exceptions for the use of other industries’ work while maximising copyright protection for their own.

Such business practices are not just rights-infringing and unethical; they are knowingly hypocritical. Buried deep within such companies’ own AI T&Cs are strict prohibitions on any use without express, prior authorisation. See a snippet in our GIF of Microsoft’s commercial double standards and just how ‘social’ their contracts are.

OpenAI’s Chief Technology Officer, of all people, claims she “isn’t sure” if the company is scraping anything at all – never mind the world’s music – from our digital service partners such as Instagram, YouTube and Facebook.

Well, more and more governments are sure. Circling around the words “publicly available” won’t cut the mustard in any attempted defence against a claim of copyright infringement.

But don’t fall for AI companies’ publicly feigned ignorance. Behind closed doors we know they know precisely what they are doing.

In February OpenAI told the British government “it would be impossible to train today’s leading AI models without using copyrighted materials”.

This from a company whose founding statement proclaimed “our goal is to advance digital intelligence in the way most likely to benefit humanity, unconstrained by a need to generate financial return”.

It soon found itself in an unconstrained moment when claiming it needed up to $7 trillion (sic) in investment to be able to deliver value.

Want insight on GenAI music on Udio? Depeche Mode fans can decide if this sounds – and, importantly, legally reads – like the great Dave Gahan & Co. ICMP’s companies have made up their minds.

If an individual permanently takes a digital song from a licensed music service without permission, that’s simply ‘streamripping’. AI companies scraping the internet and making unlicensed GenAI music is glorified streamripping on an industrial scale.

Our team will continue to convey extensive evidence to authorities. Because left untackled, unlicensed GenAI music presents profound and present risks.

These risks include threats to future revenue streams in licensed User-Generated Content (UGC) markets, whose value our industry has worked so hard to drive up in recent years post-Article 17. This issue matters all the more considering that approximately 65% of all internet traffic today is streamed video-on-demand (SVOD) content, up 9% from three years ago.

Preventing AI music from diluting royalty pools, and guarding against stream manipulation, also requires constant vigilance.

When our member companies protect their songwriters by removing unlicensed music from YouTube, Google now offers users ‘replacement music’ from YouTube’s Audio Library. Where’s that music coming from?

These issues and many more are all key as we work with governments on AI laws.

Demystifying AI

Marketers tactically anthropomorphise AI, variously aiming to befuddle governments, entice VC investment and vernacularise legal terms. Even the very name itself does this, after all.

Demystifying AI is key. It does not “ingest”; it scrapes digital content. It does not “hallucinate”; it makes basic errors when the datasets used for model training are stripped of key metadata (for example, when scraped from social media). You can call it “training” or automated algorithmic assessment of datasets, but the bottom line remains the same: AI does not choose whether to disobey copyright laws; companies and their CEOs do.

AI companies cannot claim copyright isn’t important in the scheme of things. Copyright is the scheme of such things.


Multi-Pronged Response

Malpractices such as the serious examples above are the reason our industry is engaged in a multi-pronged response. They were also the catalysts for our board’s instructions to ICMP to work towards copyright enforcement provisions in the EU AI Act (and likewise in more than 40 different draft AI laws worldwide!).

In 2021, we analysed every piece of national and international copyright legislation available to us. The consistent conclusion? Existing copyright laws are robust enough to tackle most issues.

There is no legal or indeed moral excuse for the commercial use of music by AI companies without the prior permission of songwriters and rightholders.

Consequently, our early mantra to our industry has been: ‘Protection first, then potentially pivot to profit’.

ICMP members are taking no shortcuts in robustly defending their songwriters and composers. Concord is helping spearhead a copyright infringement case against Anthropic. Sony Music Publishing wrote to more than 700 AI companies about the need to secure licenses for use. Warner Chappell have done likewise. Universal Music Publishing stoutly defended their songwriters when TikTok threatened to deluge the platform with AI music containing no human input. Reservoir is using AI to drive B2B efficiencies. BMG is innovating in AI research and Kobalt are deploying AI for licensing admin efficiencies.

On top of this, www.RightsAndAI.com now protects millions of creators as music publishing companies, labels and CMOs reserve their rights against unlicensed AI. Got music rights and want to see a fair future? You can sign up there.

US label trade body the RIAA is managing infringement cases against two companies of serious concern: Suno and Udio. The RIAA team have the complete support of our industry.

So what’s in the landmark AI Act, formally published today and entering into force across the EU in 20 days? Plenty. It’s worth recalling some of the extraordinary circumstances in which it came about.


A Unique Political Battle

EU lawmaking is complex. 27 national governments, 705 European Parliament politicians (then) and the powerful European Commission executive are the playing field’s main bases.

The AI Act posed more complexities than most campaigns. For example, France’s senior government Minister for Digital Affairs, Cedric O, held the line in 2022 that “we need more AI regulation”. Last year, having jumped ship to Europe’s leading AI company Mistral, he was corralling the very highest levels of the French government to a changed tune: “the EU’s AI Act could kill our company”.

Remarkably, late in the day France and Germany – both home to creative industry powerhouses – aligned with more copyright-sceptical governments in their reluctance to agree copyright-relevant provisions in the AI Act. There was a fear of dissuading AI investment.

It’s in that context that the music, sports, TV, photography, book, gaming, newspaper and other trade bodies worked in close concert.

Smart, tenacious and detail-orientated Members of the European Parliament such as MEPs Dragoș Tudorache, Axel Voss and Iban García del Blanco backed us all and dug in, impervious to serious political pressure. The European Commission remained open to solutions and with some diplomatic drafting ended up in the right place.

The final stages of the high-level political negotiations took 37 hours in a windowless room. ICMP’s board, based across 5 continents, were getting updates round the clock in what was a dramatic endgame.

In the end, of 618 votes cast in the European Parliament – involving hundreds of politicians of Far Left, Far Right, Green, Pirate (Yes), Centrist, Liberal, Socialist and Centre Right persuasions – 523 backed the deal. All 27 EU governments voted unanimously in favour. That’s a whopping endorsement in anyone’s book.

Europe’s political appetite to rigorously tackle AI malpractices is now strong.


What’s In The Deal?

Songwriters and music companies can take confidence in the AI Act. Among other things, AI companies will be obliged to:

  • Comply with existing copyright laws (a crucial restatement which brings its own, separate waterfall of obligations).
  • Develop policies to respect rights reservations, such as those made on RightsAndAI.com.
  • Disclose use of copyright protected music.
  • Observe this transparency rule irrespective of where in the world AI training takes place. This is a big deal. For example, if a US-based tech company outsources scraping and training to a company in Singapore, as soon as the US-generated output is available online in Europe, the US company is on the hook and must know about the source and content of the Singaporean datasets.
  • Retain detailed records on training data.
  • Watermark GenAI outputs.
  • Label ‘Deep Fakes’.

Financial and market sanctions for non-compliance are built in.

There will be no more room for AI companies to shift responsibilities to third parties.

Consequently, it’s not hyperbole to say the AI Act matters for absolutely everyone in music, even far beyond Europe.


What next?

Now the EU law is out, ICMP is working on its future enforcement. It’s time (as always at a trade body) to move fast and fix things. We will use these laws to ‘judo flip’ malpractices and work to end infringement.

A new EU AI Office to monitor compliance is being set up. Our team’s engagement there is ongoing and will continue not for months but, most likely, for decades.

Our staff are also working to support our 76 national groups with the many governments worldwide who are analysing the EU Act for inspiration.

In India – the world’s largest software economy – we applauded Industry Minister Goyal’s statement that AI companies must respect the economic rights of rightholders.

China’s internet courts have issued the world’s first GenAI copyright infringement ruling.

Next up among many AI law reforms are Hong Kong, Mexico and Singapore.

I’m unsure any issue has ever been tackled by so many governments, so simultaneously and so speedily.


The Best Transparency Is A License

In parallel, music publishing companies have made it abundantly clear their doors are open for good faith licensees.

Those AI companies who claim licenses are technically too hard? Untrue. Aside from legal examples already on the market, any AI company that can’t figure out an ISWC code should probably exit the sector, stage left.

The role – and persistent frothiness – of financial markets also needs factoring in. ROI on the multiple hundreds of billions of dollars in seed capital and VC funding of AI companies remains relatively negligible. As the great financier Ben Graham said: “any time there’s a lot of speculation in the market, it gets corrected eventually. Any business, in the short run it’s a voting machine, in the long run it’s a weighing machine”. AI markets can only deliver real weight and value via proper copyright compliance.

So, it’s fair to say there are many interesting roads ahead, as Artificial Intelligence reckons with real rights.

Music Business Worldwide




