
Music Streaming Has a $2 Billion Fraud Problem That Goes Beyond AI

2023-05-12 19:26

Staring at his computer screen, Kristoffer Rom couldn’t believe the numbers rolling in from Spotify.

A year and a half earlier, in 2018, his independent music label, Tambourhinoceros, had released a languid, synth-pop song, titled Hey Kids, by Molina, a Danish-Chilean singer. The initial reception was modest. But then, months later, it started taking off on TikTok and YouTube as creators embraced the song as a catchy, mood-setting score for all manner of emotive videos and animations.

From there, the momentum spread to Spotify, Apple Music and other streaming services. By March 2022, the song was generating more than 100,000 streams per day. “It was amazing to see all that traction,” Rom said.

But his team’s initial excitement was soon tempered by an unsettling realization. The growing popularity of Hey Kids had caught the attention not only of TikTok and YouTube performers but also of another, more pernicious, if less widely recognized, mainstay of the modern media ecosystem: streaming music scammers. Taking advantage of loose restrictions in an age of automated music distribution, such scammers have learned how to rake in money from mainstream music platforms, either by circulating minimally altered copycat versions of popular songs and collecting the resulting per-stream payouts, or by mislabeling uploaded content so that listeners inadvertently consume the scammers’ own music or ads.

Much to Rom’s growing dismay, ripped-off versions of Hey Kids — slightly modified but largely indistinguishable from the real thing — were suddenly proliferating across the streaming music landscape, siphoning listeners away from Molina and unfairly pocketing the resulting streaming royalties. Worse yet, nobody at the major services appeared to be doing anything effective to stop the spread of the knockoffs.

“You have the ecstatic joy of people doing creative, great things with the music you’ve put out, on the one hand,” Rom said. “And the total frustration and anger of witnessing people trying to exploit it.”

Currently, much of the music industry is preoccupied with the newest threat — or maybe opportunity — to emerge from Silicon Valley. With AI-generated songs of mysterious provenance already going viral on streaming platforms, industry executives, most notably Spotify Technology SA Chief Executive Officer Daniel Ek, have been quick to promise heightened vigilance on behalf of labels, artists and copyright holders. But while the platforms are warily sizing up the shiny new disruptive force, labels and managers say that fraud of a more prosaic nature is already rampant.

Read More: AI Music Brings the Sound of Scammers to Spotify

Beatdapp, a company that works with services to detect and remove fraud, estimates that at least 10% of streaming activity is fraudulent. Applied across the vast scale of digital music, what can at first appear to be small-time, garden-variety trickery adds up to sizable theft. Beatdapp said the streaming subterfuge could amount to roughly $2 billion in misallocated revenue every year.

People in the music industry who spoke with Bloomberg say the majority of the problems they grapple with tend to surface on the biggest global streaming platforms, Spotify and Apple Music. By contrast, they say, Alphabet Inc.’s YouTube Music has been much cleaner. That’s in part because YouTube has for years maintained a powerful Content ID system that identifies infringing content and then allows rightsholders either to remove it entirely or to monetize it themselves. (Unauthorized versions of Hey Kids on YouTube, for example, now divert any resulting ad revenue back to Molina’s team at Tambourhinoceros.)

A Spotify spokesperson said via email that “stream manipulation and content misrepresentation are industry-wide issues,” which the company “takes seriously” and which are “against our policies.”

“We have robust, active mitigation measures in place that identify bad actors, limit their impact and penalize them accordingly, including withholding royalties,” the spokesperson wrote. “We are continuously evolving our efforts to limit the impact of such individuals on our service.”

Apple Music didn’t respond to a request for comment.

Ben Gaffin, an artist manager and founder of Sound Advice, a music-services business that represents producers, artists and media companies, said he often encounters a particular type of streaming scam. Someone will make a track and distribute it across the streaming services while intentionally tagging it with the name of another, more successful artist. Thanks to the false metadata, the platform’s algorithms will then start automatically serving the mislabeled track to the legions of fans of the real musician and incorporating it into popular playlists, generating a surge of unwarranted streams.

Sometimes, lesser-known artists use this trick to try to draft off the attention surrounding a popular act. Other times, the deceptively tagged track isn’t even a song but a recorded voice urging listeners to buy something on a particular website. Basically, a rogue ad.

During a recent interview with Bloomberg, Gaffin began hunting around for an example and quickly found one such track “featuring” his artist Clams Casino. By the time Gaffin happened upon it, the mis-tagged track had already tallied more than 55,000 plays on Spotify. Gaffin said he typically finds out about counterfeits only when he gets a notification from Spotify for Artists alerting him that new music is ready for release when, in fact, no new work is planned, or when fans start posting angrily about a new track they dislike.

“It’s a vulnerability in the system that is being exploited,” Gaffin said.

Talya Elitzer, co-founder of the label Godmode Music, sees the same tactic targeting her artists a couple of times a month. Often, she said, it takes streaming platforms up to a week to process her takedown requests.

“It seems like a fairly easy fix that every artist should have a code or security thing,” Elitzer said. “By the time you see it, it’s too late.”

Part of the challenge is that in the streaming age, more or less anyone can get tracks uploaded onto major streaming platforms with little scrutiny or oversight. There are many services, such as DistroKid, CD Baby, and TuneCore, that empower users to distribute their songs to the big platforms using do-it-yourself software. The process of distributing new music to retailers, which not long ago was a labor-intensive, hands-on process, has grown largely automated.

“Way, way, way back in the day, we had a team of people that listened to every CD that came in the door,” said Christine Barnum, the chief revenue officer at CD Baby. “Operating at this scale, that’s not feasible.”

As the amount of amateur content being uploaded to streaming services has exploded, businesses like Spotify that were originally set up as outlets for professional musicians have started to look more like user-generated content platforms. Spotify said over 100 million songs exist on its service, and as of February 2021, 60,000 tracks per day were being uploaded. Apple Music and Amazon Music also recently said they offer listeners a 100-million-song catalog.

Vickie Nauman, founder and CEO of CrossBorderWorks, a music and technology consultancy, said that the growing scale at which streaming services operate is making it much easier for dishonestly labeled tracks to slip through.

“Certainly in the world before we had 100,000 songs uploaded a day, it was easier to monitor,” she said.

For the most part, the task of swatting down scammers falls on rightsholders, who must manually submit takedown requests for each problematic track they identify, a process that can be particularly burdensome for small, independent labels.

To this day, executives at Tambourhinoceros continue to find new uploads ripping off Hey Kids. On some, the fraudsters have changed the name of the song, luring in unsuspecting listeners with variations of hashtags used on TikTok. Others feature slightly sped-up or slowed-down versions, seemingly tweaked to avoid fraud detection software while still sounding almost identical to the original work.

The most popular bogus upload they uncovered had accumulated more than 700,000 plays, possibly accounting for over $2,000 in lost revenue.
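Per-stream payouts are not published by the platforms and vary by market and deal terms, but an assumed ballpark rate of about $0.003 per stream makes the label’s arithmetic legible. The short Python sketch below uses that assumed rate, which is not a figure from this article, to show how 700,000 plays translate into roughly $2,100.

    # Illustrative only: per-stream payouts vary widely by platform, market
    # and deal terms. The rate below is an assumed ballpark figure, not one
    # reported in this article.
    ASSUMED_PAYOUT_PER_STREAM = 0.003  # USD per stream, hypothetical average

    def estimated_lost_royalties(streams: int, rate: float = ASSUMED_PAYOUT_PER_STREAM) -> float:
        """Estimate royalties diverted by a bogus upload with the given play count."""
        return streams * rate

    print(f"${estimated_lost_royalties(700_000):,.0f}")  # prints $2,100, in line with the label's estimate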

“That’s a lot of money for anybody but especially us, an independent label from Denmark,” Rom said. “We really need to get the money from what we actually do.”