Earlier this year, a band called The Velvet Sundown racked up hundreds of thousands of streams with its retro-pop tracks, drawing a million monthly listeners on Spotify.
But the band wasn’t real. Every song, image and even the band’s backstory had been generated by someone using generative AI.
For some, it was a clever experiment. For others, it revealed a troubling lack of transparency in music creation, even though the band’s Spotify descriptor was later updated to acknowledge it is composed with AI.
In September 2025, Spotify announced it is “helping develop and will support the new industry standard for AI disclosures in music credits developed through DDEX.” DDEX is a not-for-profit membership organization focused on the creation of digital music value chain standards.
The company also says it is working on stronger enforcement of impersonation violations and a new spam-filtering system, and that the updates are “the latest in a series of changes we’re making to support a more trustworthy music ecosystem for artists, for rights-holders and for listeners.”
As AI becomes more embedded in music creation, the challenge is balancing its legitimate creative use with the ethical and economic pressures it introduces. Disclosure is essential not just for accountability, but to give listeners transparent and user-friendly choices in the artists they support.
A patchwork of policies
The music industry’s response to AI has so far been a patchwork of ad hoc enforcement, as platforms grapple with how to manage emerging uses and expectations of AI in music.
Apple Music took aim at impersonation when it pulled the viral track “Heart on My Sleeve” featuring AI-cloned vocals of Drake and The Weeknd. The removal was prompted by a copyright complaint reflecting concerns over misuse of artists’ likeness and voice.
The indie-facing song promotion platform SubmitHub has introduced measures to combat AI-generated spam. Artists must declare if AI played “a major role” in a track. The platform also has an “AI Song Checker” so playlist curators can scan files to detect AI use.
Spotify’s announcement adds another dimension to these efforts. By focusing on disclosure, it recognizes that artists use AI in many different ways across music creation and production. Rather than banning these practices, it opens the door to an AI labelling system that makes them more transparent.
Labelling creative content
Content labelling has long been used to help audiences make informed choices about their media consumption. Movies, TV and music come with parental advisories, for example.
Digital music files also include embedded information tags called metadata, with details like genre, tempo and contributing artists that platforms use to categorize songs, calculate royalty payments and suggest new songs to listeners.
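As a rough illustration of what these embedded tags look like, here is a minimal sketch using the Python library mutagen to read and update the ID3 metadata in an MP3 file (the file name and tag values are hypothetical, and the file must already carry an ID3 tag):

    from mutagen.easyid3 import EasyID3

    # Open the embedded ID3 metadata of a (hypothetical) MP3 file.
    tags = EasyID3("example_track.mp3")

    # Read a few of the descriptive fields platforms rely on.
    print(tags.get("artist"))   # contributing artist(s)
    print(tags.get("genre"))    # genre
    print(tags.get("bpm"))      # tempo, in beats per minute

    # Update a field and write it back into the file.
    tags["genre"] = ["Synth-pop"]
    tags.save()

An AI-disclosure credit of the kind DDEX is standardizing would presumably travel through the music supply chain as similarly structured metadata, though the exact fields are still being worked out.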
Canada has relied on labelling for decades to strengthen its domestic music industry. The MAPL system requires radio stations to play a minimum percentage of Canadian music, with a track qualifying as Canadian content when at least two of its music, artist, performance and lyrics elements are Canadian.

