$125m-backed Suno is being used to make racist and antisemitic music


Suno, the increasingly popular and well-funded AI-powered music generator whose tech has been integrated into Microsoft products, has been used to create music with hateful messages, according to a report from the Anti-Defamation League (ADL).

The ADL says it has uncovered “a vast library of disturbing songs created with Suno,” including songs that glorify Hitler and “white power,” tracks that employ racial slurs, and others that spread racially charged misinformation about the Israel-Hamas war and the COVID-19 pandemic.

The tracks highlight “the critical need for stronger content moderation guidelines and heightened awareness of the myriad ways in which bad actors are weaponizing” generative AI, the ADL said in a blog post published on Friday (June 14).

Among other things, the blog post shared screenshots from the Suno platform of a gangsta rap track called Squatting for Hitler, whose lyrics refer to a “national awakening” on behalf of the Nazi leader:


Image via ADL

Another screenshot featured the lyrics to a track called Wuhan Blues, which “includes bigoted [anti-Asian] tropes and stereotypes about the COVID-19 pandemic,” the ADL said.


Image via ADL

Suno’s terms of service forbid content that is “unlawful, harmful, threatening, abusive, harassing, tortious, excessively violent, defamatory, vulgar, obscene, pornographic, libelous, invasive of another’s privacy, hateful, discriminatory, or otherwise objectionable.”

However, the ADL says users have been able to circumvent the restrictions on hateful content with a variety of tricks that slip past the AI, and have been sharing those tricks on social media platforms.

The organization said it has contacted Suno and Microsoft about these songs, but “did not receive a response”.

However, it appears that Suno has taken action, at least with respect to the tracks flagged by the ADL. In a search of Suno’s catalog on Tuesday (June 18), MBW was unable to find any of the tracks cited in the ADL’s report.

The report comes a month after Suno closed a $125-million Series B funding round that valued the company at a reported $500 million. Investors in the firm include VC firms Matrix Partners, Lightspeed Ventures, and Founder Collective, as well as former GitHub CEO Nat Friedman, and Andrej Karpathy, a co-founder of ChatGPT maker OpenAI.

Suno also has the backing of Microsoft, in the form of a partnership that has seen Suno’s tech integrated into Microsoft’s Copilot AI app.


For the music industry, Suno is a growing source of concern, amid suspicions that the company trained its AI on copyrighted music without authorization.

In a recent column for MBW, Ed Newton-Rex, former Head of Audio at Stability AI and the founder of the non-profit ethical AI accreditation organization Fairly Trained, showed the results of a forensic analysis of Suno-generated songs, which Newton-Rex said suggested the company had trained its AI on material from Ed Sheeran, ABBA, and Queen, among others.

One of the company’s early investors – Antonio Rodriguez of Matrix Partners – suggested to Rolling Stone earlier this year that he was prepared for Suno to be sued by copyright holders, but that he preferred the company not sign licensing agreements with record companies because “they needed to make this product without the constraints.”


Suno’s problems with hateful content reflect those experienced in recent years by other digital companies in the music space, and by firms offering public access to AI technology.

Multiple investigations into music streaming services have uncovered hateful content on these platforms. In 2017, Spotify removed music uploaded by a number of white supremacist groups – as defined by the Southern Poverty Law Center – after tracks from these bands were found on the streaming platform.

However, removing this content from music streaming platforms has proven to be a game of whack-a-mole.

In 2020, a BBC investigation uncovered more racist content on Spotify, as well as on Apple Music, Deezer and YouTube Music, including songs that featured “an excerpt of a Hitler speech, calls for ‘Aryans’ to make a brand new start and references to white power.”

In 2022, the ADL reported it had found “40 white supremacist artists with a presence on Spotify.” Although the platform had updated its rules following an earlier ADL report on extremist content on the service, the rules “do not appear to be strictly enforced,” the ADL stated.

AI technologies have also proven susceptible to these types of issues. Perhaps most famously, Microsoft shut down its chatbot Tay in 2016 after users manipulated it into expressing hateful comments within hours of launch.

On Tuesday (June 18), the UN culture and education organization UNESCO issued a report warning that AI could be used to “distort the historical record of the Holocaust and fuel antisemitism.”

“ChatGPT and Google’s Bard have both produced content detailing Holocaust-related events which never took place. ChatGPT entirely fabricated the concept of ‘Holocaust by drowning’ campaigns in which the Nazis drowned Jews in rivers and lakes, and Bard generated fake quotes from witnesses to support distorted narratives of Holocaust massacres,” UNESCO said.

In its report on Suno, the ADL issued a number of recommendations to address hateful content on the platform, including clearer terms of service, investing in trust and safety personnel, building “discouraging mechanisms” into Suno, and making text prompts publicly available to “demystify some of the coded language in problematic lyrics.”


