My Voice Was Cloned: Seeking Justice in the AI Era

Clara Alex
4 min read · Jun 3, 2024

Written by Ana Balashova for Kill the DJ

Photo by Tingey Injury Law Firm on Unsplash

Unauthorized voice cloning is an AI-powered Pandora’s box, causing more chaos than an overly energetic air guitarist at a karaoke bar. Among other things, it was quickly hijacked by scammers and turned into a nightmare for artists and the entertainment industry.

According to Michael Hasse, a cybersecurity and technology consultant, voice cloning is pretty simple: “Individuals have a relatively limited range of pitches in normal speech, and their use of language will have a typical rhythm. Before LLMs, it took a lot of samples to get a good reproduction, but with modern systems, it can be done with a single sample of a few seconds, as the LLM can match that against the known ‘library’ and fill in the blanks quite accurately.”

In other words, we’re all just a bunch of predictable meat sacks, and AI has figured out how to mimic us with terrifying accuracy.
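To make Hasse’s point concrete, here is a minimal sketch of few-second voice cloning using the open-source Coqui TTS library and its XTTS v2 model, one of several publicly available tools that work this way. The file names and the generated line are illustrative, and Hasse did not name any specific system:

```python
# Minimal sketch of few-second voice cloning with the open-source
# Coqui TTS library (pip install TTS). Paths and text are illustrative.
from TTS.api import TTS

# XTTS v2 clones a voice from a short reference clip instead of
# requiring hours of training data.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

tts.tts_to_file(
    text="This voice was generated, not recorded.",
    speaker_wav="reference_clip.wav",  # a few seconds of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

That is the entire barrier to entry: a short audio clip and a dozen lines of code, which is exactly why the legal questions below matter.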

Sure, voice cloning has been used for some good causes, like helping introverts start podcasts or assisting those who have lost their voices for various reasons. But the dark side of this technology is starting to rear its ugly head.

Voice actors and musicians are finding themselves in the crosshairs of unauthorized cloning, with their voices being used in all sorts of unsavory contexts. Bev Standing, a voice actress, sued TikTok for allegedly using her voice without consent or compensation. Apparently, her sweet tones were used in videos featuring “foul and offensive language,” which is a bit like finding out your angelic voice was used in a Quentin Tarantino film.

Scarlett Johansson accused OpenAI of using her voice for its “Sky” voice assistant. (This was not the first time she went after tech bros for using her voice.)

And it’s not just living artists feeling the pinch. The estate of George Carlin, the legendary comedian, sued Dudesy Media Company for producing an AI-generated comedy special mimicking Carlin’s voice and style. The unauthorized special, titled “George Carlin: I’m Glad I’m Dead,” racked up nearly 500,000 views on YouTube, proving once again that the internet loves a good dumpster fire.

Even in the world of video games, AI voice cloning is causing a stir. Skyrim voice actors found their performances replicated in mods created for less-than-savory purposes. It’s one thing to slay dragons, but dealing with AI clones in dubious fan mods? That’s a quest no one signed up for.

As the lines between innovation and infringement become blurrier than a Monet painting, artists are left wondering how to protect their voices and who can help. In this article, we attempt to figure that out.

Untangling All Things Legal

So, is there anything you can do if someone has cloned your voice? Should Kanye West and Joe Biden both silently suffer through the numerous AI-generated songs using their voices that plague YouTube?

We asked legal experts where things stand. The main question to start with is, “Who owns the voice?”

According to Terry Quan, partner at FLG Foundation Law Group: “The voice of a person comprises a part of a person’s right of publicity. As there is no federal law covering the right of publicity, the specific law for the right of publicity is based on statutes adopted on a state-by-state basis and based on common law (that is, based on case law). In California, the right of publicity is embodied in California Civil Code §3344 and protects a person’s name, voice, signature, photograph, and likeness, and it enables a person to control their voice for their own economic benefit. So, in the case of a music artist, the music artist ‘owns’ their own voice as part of the music artist’s right of publicity.

“In the context of a record label with a contractual relationship with a musical artist, the appropriate question might be, ‘Who owns the voice recording as part of a song?’ This depends on what the contract says, and usually, the record label wants the broadest rights possible to publish and monetize a voice recording (along with the music). However, there are many copyrights involved in terms of the ownership of a song where the record label usually owns the ‘master’ (that is, the recorded performance of a song), and the performer/songwriter could own the copyright to the lyrics and music.

“It is possible that different record labels may have rights to different versions of a song performed by an artist, e.g., one label might own a studio recording, and one label might own a live performance. Each label would own its own version of that voice recording, and what the label could do with it would be governed by the respective contracts negotiated with the artist. Some artists, such as Taylor Swift, exercise a lot of control and may prevent the use of voice recordings as training material for generative AI platforms.”

According to John Michael Eden, Counsel at BurgherGray LLP, in many states, voice and likeness are not protected. “There aren’t very clear laws stating you have a right of action against someone mimicking your voice and robbing you of the right to make money off your own content. There are, of course, exceptions for celebrities — for people who make a living from monetizing their likeness. There have been, however, some legislative changes in certain states. The ELVIS Act in Tennessee, going into effect July 1, 2024, gives an individual the right to protect their voice as a bona fide intellectual property right. The law covers both actual and simulated voices, and violations constitute a misdemeanor that can lead to criminal and civil penalties. The scope of the ELVIS Act is pretty clear: You can’t use someone’s voice to publish, perform, distribute, or transmit it without their authorization.”

🍿 Read more at Kill the DJ

Written by Clara Alex

Managing Editor at Kill the DJ. Content strategist for audio tech companies. Writes about music, AI in audio, podcasting, and all things audio.
