How AI Is Causing Trouble for Fans of Taylor Swift, Drake, and Other Pop Stars

Taylor Swift and Drake

In the last week, highly anticipated songs by Drake and Taylor Swift appeared to leak online, sparking enormous reactions. Massive discussions emerged, dissecting musical choices. Fans simulated other rappers’ reactions to being dissed by Drake. The rapper Rick Ross even responded to the song’s bars about him with a diss track of his own.

But there was one big problem: neither Swift nor Drake confirmed that the songs were real. In fact, loud contingents on social media claimed that the songs were AI-generated hoaxes, and begged fellow fans not to listen to them. Fervent discussions soon gave way to rabid hunts for clues and debates aimed at decoding the songs' authenticity.

These types of arguments have recently intensified and will only continue ballooning, as AI vocal clones keep improving and become increasingly accessible to everyday people. These days, even an artist's biggest fans have trouble telling the difference between their heroes and AI creations. They will continue to be stymied in the coming months, as the music industry and lawmakers slowly work to determine how best to protect human creators from artificial imposters.

The Advent of AI Deepfakes

AI first shook the pop music world last year, when a song that seemed to be by Drake and the Weeknd called “Heart on My Sleeve” went viral, with millions of plays across TikTok, Spotify, and YouTube. But the song was soon revealed to have been created by an anonymous musician named ghostwriter977, who used an AI-powered filter to turn their voice into the voices of both pop stars.

Many fans of both artists loved the song anyway, and it was later submitted for Grammy consideration. And some artists embraced new deepfake technology, including Grimes, who has long experimented with technological advancements and who developed a clone of her voice and used it to create songs.

But soundalikes soon began roiling the fanbases of other artists. Many top stars, like Frank Ocean and Harry Styles, have turned to intense policies of secrecy around their output (Ocean has locked much of his music to prevent leaking), resulting in desperate fans going to extreme lengths to try to obtain new songs. This has opened the door for scammers: Last year, a scammer sold snippets to Frank Ocean superfans for thousands of dollars. A few months later, snippets that purported to be taken from new Harry Styles and One Direction songs surfaced across the web, with fans also paying for those. But many fans said they were hoaxes. Not even AI-analysis companies could determine whether they were real, according to reports.

Drake and Taylor… Or Not?

This week, AI shook up the fanbases of two of the biggest pop stars in the world: Taylor Swift and Drake. First came a snippet of Drake’s “Push Ups,” a track that seemingly responded to Kendrick Lamar’s taunts of him on the song “Like That” (“Pipsqueak, pipe down,” went one line from “Push Ups.”) The track, which also took aim at Rick Ross, The Weeknd, and Metro Boomin, quickly went viral, and Ross released a snippet of his own.

But the internet was divided as to whether the clip was actually made by Drake. The original leak was low quality; Drake’s vocals sounded grainy and monotone. Even the rapper Joe Budden, who hosts the prominent hip-hop podcast The Joe Budden Podcast, said that he was “on the fence” for a while about whether or not it was AI.

A higher quality version of the song was subsequently released, leading many news outlets and social media posters to declare it a genuine Drake song. Strangely enough, Drake has toyed with this ambiguity: He has yet to claim the song as his own, but shared Instagram stories of people dancing to parts of it. Whether or not he made it, the song has become an unmistakable entry in a feud that has taken the hip-hop world by storm.

“Push Ups” contains a reference to Taylor Swift: It accuses Lamar of being so controlled by his label that it commanded him to record a “verse for the Swifties” on the 2015 remix of her song “Bad Blood.” On Wednesday, Swifties went into a frenzy when a leaked version of her highly anticipated new album, The Tortured Poets Department, began making the rounds online two days before its release date. Purported leaks have been popping up for months, including some that were eventually proven fake. Given all of the false trails across the web, many Swift fans dismissed these new leaks as fake as well. But the songs are also being treated as real by journalists, who are already announcing their favorite tracks and moments a day before the album’s official release.


Can the Music Industry Fight Back?

Some of these vocal deepfakes are not much more than a nuisance to major artists, because they are low-quality and easy to detect. AI tools will often get the timbre of a distinctive voice slightly wrong, and can glitch when artists use melisma—sliding up and down on a single syllable—or suddenly jump registers. Some pronunciations of lyrics also come out garbled, or with a slightly wrong accent.

But AI tools are constantly improving and getting closer to the real thing. OpenAI recently shared a preview of Voice Engine, its latest tool that generates natural-sounding speech mimicking specific speakers. Researchers and AI companies are racing to create voice clone detection software, but their success rates vary.

So some musicians and music labels are fighting back with the avenues they have available to them. Three major music publishers—Universal Music Publishing Group, Concord Music Group and ABKCO—sued the AI company Anthropic, alleging that the company infringed on copyrighted song lyrics. More than 200 musicians, including Billie Eilish, Stevie Wonder, and Nicki Minaj, recently signed a letter decrying the “predatory use of AI to steal professional artists’ voices and likenesses.” And BPI, a UK music industry group, spoke out against the vocal cloning service Jammable.

The music industry has growing support from lawmakers. Last month, Tennessee governor Bill Lee signed into law the Ensuring Likeness Voice and Image Security (ELVIS) Act, which prohibits people from using AI to mimic an artist’s voice without their permission. And U.S. senators announced a similar bill called the NO FAKES Act. “We must put in place rules of the road to protect people from having their voice and likeness replicated through AI without their permission,” Minnesota Senator Amy Klobuchar said.

It will likely take a long time for this bill or similar ones to wind their way through the halls of Congress. Even if one of them passes, it will be exceedingly hard to enforce, given the anonymity of many of these online posters and the penchant for deleted songs to pop back up in the form of unlicensed copies. So it’s all but assured that deepfaked songs will continue to excite, confuse, and anger music fans in the months and years to come.