As Tennessee's ELVIS Act goes into effect, the music industry braces for legal fallout
Tennessee's ELVIS Act goes into effect on Monday, ushering in a new era of legal consequences for those who abuse AI in making music, as the public wonders what will come next.
The act, short for the Ensuring Likeness Voice and Image Security Act, became law on March 21 when Gov. Bill Lee signed it at Robert's Western World, a honky-tonk on Broadway in downtown Nashville.
The ELVIS Act establishes protections against deepfakes and other unauthorized uses of artists' voices and likenesses. As it goes into effect, it's time to assess how experts think the new law will affect the way we produce and consume art.
Some industry professionals believe the answers about balancing tech, creativity and the law will only come with time. Meanwhile, technology companies are racing to create software to combat AI abuse in music, and some music trade organizations are filing landmark lawsuits against AI music companies.
A dozen interviews with lawyers, tech companies and music trade organizations revealed answers about what to expect in the coming months.
ELVIS Act expands vocal protections for all, not just music creators
The ELVIS Act adds artists' voices to the state's current Protection of Personal Rights law, which can be criminally enforced by district attorneys as a Class A misdemeanor.
Artists—and anyone else with exclusive licenses, like labels and distribution groups—can sue civilly for damages as well.
Tech companies that develop these AI tools can also be held liable for providing the software that produces illegal vocal clones.
But there's more—the law expands the public's ability to sue for any form of vocal cloning; the product doesn't have to be made for commercial use or financial gain.
Under the new law, someone who creates an AI-generated song and posts it to streaming platforms for financial gain can be held liable, and so can someone who creates a deepfake of a person saying a slur.
Experts predict ELVIS Act will lead to onslaught of legal action
One of the groups that will use the new legal infrastructure, and that was integral in advocating for the legislation, is the Recording Industry Association of America (RIAA).
"I anticipate that a lot of voice clones to whom we have been sending notices for over a year will understand that the law is much more clear now," said CEO and Chairman of the Recording Industry Association of America, Mitch Glazier.
Over the past year, Glazier said he has sent about 7,000 notices nationwide to those who have unlawfully utilized vocal cloning.
"We are hopeful that starting July 1, they will know that they have to take much more seriously notices that we send that they immediately remove any kind of unauthorized voice clones from their user uploaded platforms," he said.
"But if they don't, we will now have a very strong direct law under which we can sue them directly," Glazier added.
He said that some companies and creators take the warning notices seriously already, while others dismiss them. Even though this is a Tennessee law, Glazier emphasized that the RIAA—and others—only need to find a connection to Tennessee to bring a lawsuit.
If a song isn't created in Tennessee, organizations may still be able to establish grounds for a suit through a label or distributor with ties to the state.
Though Glazier said it's hard to predict, he anticipates much of the ELVIS Act's effect will happen behind closed doors. Many of these legal notices are private, and even if a suit is brought, he said most parties would usually rather settle quietly.
So, while this new legal infrastructure is in place, the public at large may not see it working.
Entertainment lawyer David Johnson, a partner on Lippes Matthias' Intellectual Property Team, agreed with Glazier that matters will be settled quietly until a few companies or artists find their claims strong enough to take into a public legal battle.
"It really is a practical milestone on what is or is not allowed," Johnson said.
"We're going to see the other states as they are passing new laws," Johnson said. "They'll consider this, they'll look at the issues that are happening in Tennessee, and they'll make their own tweaks, and everybody will... advance together."
Already, similar legislation is being considered in California and Kentucky. Colorado and Utah have also enacted bills that protect consumers in certain interactions with AI.
ELVIS Act's legal critics fear it will impact the right to tell stories
Some entertainment lawyers have reservations about the ELVIS Act, especially those in states where similar legislation is pending. These professionals worry that some storytelling tools that are not AI-based, such as other forms of vocal mimicry, could be in jeopardy.
Lisa Callif, an entertainment lawyer based in Los Angeles, works for the firm Donaldson Callif Perez and specializes in representing independent film producers and production companies.
Callif's work focuses on whether a project affects someone's right to privacy or publicity, or any other personal rights they may hold.
"I do feel like this was rushed into," Callif said of how quickly the bill passed; it was introduced on January 10 and passed on March 21. "I think there was a lot of excitement in the music industry," Callif said.
From the filmmakers' point of view, Callif believes the new statute will affect her clients' First Amendment right to tell a story in ways the act's language did not intend.
Callif said that digital AI reenactments in films can be just as compelling as, if not more compelling than, archival footage or staged reenactments when telling a story.
"Our viewpoint has pretty much been, as long as we're not deceiving the public and as long as it's really clear that this was created with AI, it's really no different than doing a reenactment or using archival materials to tell the same story," she said.
"I feel like that point...is really important," she added. "And I don't think that's really addressed in the statute."
David Johnson's coworker at law firm Lippes Matthias, Matthew Asbell, has concerns about the "cross border aspects" of the ELVIS Act.
One example Asbell provided is Disney's need to reproduce its musical movies in different languages. For a hit like "Let It Go" from "Frozen," he said, "You go and you find sound-alikes in every country around the world to sing it in their language, who really do sound very similar."
"I would worry about that sort of situation where you have not necessarily used AI, but you have the potential for liability for sound-alikes where there's actually a good reason to need a sound-alike," Asbell said.
While Callif and Johnson had critiques of the bill, they both see its value and importance to creators.
Some lawyers were quick to point out, too, that there were legal precedents set to protect vocal likeness prior to the ELVIS Act.
Nearly 40 years before the ELVIS Act, there were legal cases about vocal likenesses
A 1988 court case in California set a precedent for protecting vocal likeness long before AI could clone performers, back when impersonators were the only concern.
"I think the concept of using someone's voice—without authorization and in a deceptive way—isn't something that's ever really been acceptable," Callif said. "So I don't think that that part of the law, which seems to be the biggest part of the law, really changes all that much."
In 1985, an advertising agency used songstress Bette Midler's likeness in a Lincoln-Mercury commercial for Ford Motor Co.
When "The Rose" singer turned down participating in the commercial, the agency found an impersonator who had worked closely with Midler to sing one of her songs for the commercial instead.
The impersonator mimicked Midler so well that fans mistook it for Midler's voice in the commercial, so Midler sued the company.
The Los Angeles federal court sided with Midler, who collected $400,000 in 1992 after the Supreme Court let the decision stand.
In 1992, Don Engel, a Los Angeles attorney, told the LA Times, "The fact that the Supreme Court let this ruling stand represents a major expansion of the right of publicity. It will certainly cause ad agencies to sit up and pay attention."
And though this case occurred nearly 40 years ago, Tennesseans are hearing plenty of the same rhetoric surrounding the new vocal protections statute today.
Lawyers think ELVIS Act will cause Tennesseans to 'pay attention'
"I think that Tennessee, by leading the charge and taking this first step—even if it's not seen as like the perfect step by many of the critics, nonetheless, we did something," said Nashville music entertainment lawyer Farrah A. Usmani of Nixon Peabody.
In a songwriting city, Usmani expects to see some conflict around the use of AI voices layered on demo tracks recorded in Nashville. She's already seeing it with her clients.
"There's some people that are really embracing it and think it's like kind of a time saving measure," she said. "And then we've also heard that other artists are very much saying that if they receive demos that mimic or AI filter their vocals, that they're not even going to listen to it. They'll be offended by it."
Time will tell where the city stands with AI demos.
Beyond the music industry, Usmani thinks the law's inclusion of purposes beyond commercial distribution is a win.
"The person who's kind of the victim of the deep fake, (the other states' laws don't) necessarily provide them with any kind of remedy or recourse, because if you're not using it for commercial purpose, historically, you weren't necessarily entitled to this full spectrum of protections," she said.
But the kicker for Usmani is the new law's ability to add more clarity for generative AI companies and attorneys counseling these companies.
Usmani said that any tech companies who are creating AI tools for vocal deepfakes should thoroughly comb over their contractual terms. "I think that companies need to explicitly state that these tools aren't to be used for illegal purposes," Usmani said.
Nashville tech company races to create tools to battle unethical AI in music
Now, technology companies on the other side of the equation are trying to develop ways to tell listeners whether the music they're hearing is authentic or AI-created.
It's a race to see which company will bring its product to market first.
One of these companies is Nashville-based ViNIL, a digital certification, tracking and licensing startup that hopes to launch by this fall. ViNIL has developed a fingerprinting technique that lets content creators preemptively approve and authenticate that what audiences hear and see is legitimate.
ViNIL was started by Nashville entrepreneur and singer-songwriter Charles Alexander, entertainment lawyer Jeremy Brook, who focuses on intellectual property, and software whiz and ex-game developer Sada Garba.
ViNIL's creators say their new software is intended for artists, record labels, rightsholders and celebrity estates, but can be used beyond just music.
Their tech enables a stamp of approval from the creators themselves (think of Instagram's blue check mark system).
How does it work? The new software feeds music through a "Media Processing Engine," which lets ViNIL uniquely tag and fingerprint the content and generate a cryptographic stamp for it.
From there, ViNIL can track which works the artists approve. Creators can also approve AI-generated works of themselves, stamping it as approved for widespread consumption (for instance, Randy Travis could stamp his AI-created song, one he recently crafted after he suffered a stroke that impacted his voice).
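ViNIL hasn't published the engine's internals, but the general idea it describes, fingerprint the content and then cryptographically stamp it, can be illustrated with a minimal sketch. The Python example below is hypothetical, not ViNIL's actual system: it hashes the audio bytes as a stand-in for a fingerprint and signs that hash with the creator's private key, so anyone holding the matching public key can verify the stamp.

```python
# Hypothetical sketch of "fingerprint and stamp" content authentication.
# Not ViNIL's actual Media Processing Engine.
import hashlib

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def fingerprint(audio_bytes: bytes) -> bytes:
    """Reduce the content to a fixed-size digest (a stand-in for a real audio fingerprint)."""
    return hashlib.sha256(audio_bytes).digest()


def stamp(audio_bytes: bytes, private_key: Ed25519PrivateKey) -> bytes:
    """Sign the fingerprint with the creator's private key: the 'cryptographic stamp.'"""
    return private_key.sign(fingerprint(audio_bytes))


def verify(audio_bytes: bytes, signature: bytes, public_key) -> bool:
    """Confirm the stamp matches both the content and the creator's key."""
    try:
        public_key.verify(signature, fingerprint(audio_bytes))
        return True
    except InvalidSignature:
        return False


# A creator stamps a track; a platform later checks the stamp before surfacing it.
creator_key = Ed25519PrivateKey.generate()
track = b"raw audio bytes would go here"
signature = stamp(track, creator_key)

print(verify(track, signature, creator_key.public_key()))                 # True
print(verify(track + b"tampered", signature, creator_key.public_key()))   # False
```

A production system would use a perceptual fingerprint that survives re-encoding and trimming rather than a raw byte hash, and would tie public keys to verified identities, which is where the Instagram-style check mark comparison comes in.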
For Alexander, Brook and Garba, their main goal is to ensure artists maintain control, give consent and are properly credited and compensated.
Though digital watermarks can be removed, the ViNIL founders believe they've developed an airtight approach that tags content from the ground up in a way bad actors cannot strip out.
Once launched, their goal is to become a seamless part of content creation and distribution, Alexander said, working behind the scenes to become a commonplace technology in authenticating content.
When it comes to what the music industry is asking from the tech world, many professionals are also concerned with how the AI models are learning.
On June 23, the RIAA sued AI production companies Suno and Udio in New York and Boston federal courts, alleging that the companies are illegally using copyrighted sound recordings to "train" generative AI models.
"Winners of the streaming era worked cooperatively with artists and rightsholders to properly license music. The losers did exactly what Suno and Udio are doing now," an RIAA spokesperson said.
Glazier's call to action for the tech companies moving forward? He wants new technology that can help organizations see the input and training data that AI companies are using.
"These AI companies are training on artists, voices, copyrighted work images, but they're not keeping records. If they are keeping records, they're not being transparent about it. We have to then sue them in order to get access to what they trained their system with," he said.
Glazier seeks transparency from these AI companies, and if he can't get it, he wants either software that can provide that safeguard for him or legal recourse.
Audrey Gibbs is a music reporter at The Tennessean. You can reach her at [email protected].
This article originally appeared on Nashville Tennessean: Tennessee's ELVIS Act goes into effect on July 1. Here's what we'll see next.