Imagine Taylor Swift endorsing your product. Can't afford her? No worries, AI can fake it for you!
Thanks to AI deepfake technology, you can create a counterfeit but realistic Taylor Swift endorsement without involving her. Deepfakes use machine learning to create eerily realistic images, videos, and audio mimicking real people.
Scammers used AI to create a deepfake video in which a phony Taylor Swift announced she was giving away cookware sets from Le Creuset, a real luxury cookware brand. After following some prompts, deceived shoppers were asked to pay a small shipping fee. Those who paid didn’t receive cookware but did get hit with a hidden monthly credit card charge.
Tom Hanks got hit too. Scammers made a deepfake video of him to hawk dental insurance.
Faking ads and endorsements by living celebrities is obviously illegal. But what about deceased ones? What about mimicking government officials, such as a President? What if you replicate only the person’s famous voice but don’t identify the person? What if your fake endorsement is an obvious parody of the celebrity?
How Does AI Deepfake Technology Work?
The most common tool is a generative adversarial network, which techies call a GAN. One machine learning model, often a neural network, creates what it hopes is the most convincing fake matching its assigned task, such as a phony Taylor Swift endorsing cookware. This first network is called the “generator.”
A competing machine learning model tries to learn to discern whether a given sample is real or a fake produced by the generator. That second neural network is called the “discriminator,” and it is trained on both real and fake data.
The generator earns a reward based on how often it can succeed in faking out the discriminator. The generator and discriminator train together, and over time, the generator gets better at producing realistic fakes – ones good enough to fool the discriminator.
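To make that generator-versus-discriminator training loop concrete, here is a minimal sketch of a GAN in Python using PyTorch. Instead of faking a celebrity, it learns to fake samples from a simple bell-curve distribution; the network sizes, learning rates, and step counts are illustrative assumptions, not a recipe for a real deepfake system.

```python
import torch
import torch.nn as nn

# Toy task: "real" data is samples from a normal distribution
# (mean 4.0, std 1.25). The generator learns to produce fakes the
# discriminator can't tell apart from the real samples.
latent_dim = 8  # size of the random noise fed to the generator

# Generator: turns random noise into a candidate fake sample.
generator = nn.Sequential(
    nn.Linear(latent_dim, 16), nn.ReLU(),
    nn.Linear(16, 1),
)

# Discriminator: outputs the probability that a sample is real.
discriminator = nn.Sequential(
    nn.Linear(1, 16), nn.ReLU(),
    nn.Linear(16, 1), nn.Sigmoid(),
)

g_opt = torch.optim.Adam(generator.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
loss_fn = nn.BCELoss()
batch = 64

for step in range(3000):
    # Train the discriminator: label real data 1, generated fakes 0.
    real = torch.normal(mean=4.0, std=1.25, size=(batch, 1))
    noise = torch.randn(batch, latent_dim)
    fake = generator(noise).detach()  # don't update the generator here
    d_loss = (loss_fn(discriminator(real), torch.ones(batch, 1)) +
              loss_fn(discriminator(fake), torch.zeros(batch, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # Train the generator: its "reward" is fooling the discriminator
    # into labeling its fakes as real.
    noise = torch.randn(batch, latent_dim)
    g_loss = loss_fn(discriminator(generator(noise)), torch.ones(batch, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# After training, the generator's fakes should cluster near the real mean (4.0).
print(generator(torch.randn(1000, latent_dim)).mean().item())
```

Real deepfake tools run this same adversarial loop over images, video frames, or audio at vastly larger scale, but the core idea is identical: two models training against each other until the fakes become convincing.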
The Right of Publicity – The Primary Legal Weapon Against Deepfakes
Everything is legal if you get permission, specifically a license from the person whose name, image, or likeness (“NIL”) is used. For example, James Earl Jones allowed Disney to replicate his vocal performance as Darth Vader in future projects using an AI voice-modeling tool called Respeecher. Make certain the license covers the full scope of what you will do and the duration of use.
But what if you can’t buy a license from the celebrity? The law is clear that you can’t use the NIL of a living celebrity without permission for a commercial purpose, such as advertising or endorsement. Doing so violates the right of publicity, which is a person’s right to control the use of his NIL for a commercial purpose. In some circumstances, such conduct could also constitute false endorsement and perhaps trademark infringement, both of which are claims under federal law.
The problem is knowing where the line is drawn on legality. The right of publicity is the principal body of law you must navigate. It is state-level law, and it varies from state to state.
About two-thirds of states recognize a right of publicity by statute, common law, or both. Other states usually have a “right of privacy,” which accomplishes roughly the same thing.
Most states, including Virginia, hold that the right of publicity protects everyone. Still, some states (such as Massachusetts, Michigan, and Pennsylvania) protect a person’s NIL only if it has commercial value, which essentially means celebrities.
Most (but not all) states with a right of publicity hold that it continues after death, but the length of protection varies. In Virginia, protection lasts 20 years after death. In other states giving postmortem rights, the length runs from 10 to 100 years. Sometimes, the length of postmortem protection depends on whether the person is famous or whether the person’s estate has continued to exploit the deceased celebrity's NIL commercially.
Presume the Most Stringent Right of Publicity Law Governs Your Situation.
For a business advertising using the NIL of others, it’s best to presume your activity will be governed by the most protective right of publicity law in the country. Presume that the right of publicity protects everybody’s NIL, including politicians’, not just celebrities’, and that it protects not only living people but anyone who lived in the past 100 years.
That’s because it’s difficult to determine which state’s law will apply to your activity. Each state has its own law for determining which state’s law applies in a lawsuit (called “choice of law”) and when lawsuits can be maintained in that state against a business or person that doesn’t reside there. Thus, your business’s advertising activity using someone’s NIL could result in you being sued in a different state and the application of a different state's law.
Don’t Try Just Mimicking Someone’s Voice Even if You Don’t Identify Him.
Don’t get cute by mimicking a celebrity’s voice while not identifying the person. Most states that recognize the right of publicity extend its protection to a person’s recognizable voice.
For example, in the 1980s, Ford Motor Company produced an ad for the Mercury Sable using a voice impersonator singing “Do You Want to Dance” in Bette Midler's style without her permission. Midler sued Ford and won.
In the irony department, actress Scarlett Johansson recently accused OpenAI (the maker of ChatGPT) of using her distinctive voice without her permission for the latest version of ChatGPT, powered by its GPT-4o model – a voice OpenAI called “Sky.” ChatGPT is rolling out a feature that allows you to interact with it by voice, like talking with Siri from Apple or Alexa from Amazon.
OpenAI CEO Sam Altman had contacted Johansson about being ChatGPT’s voice, but she declined. Interestingly, when OpenAI launched the Sky voice, Altman tweeted the single word “her,” which appears to reference the 2013 film “Her,” in which a man falls in love with an AI virtual assistant voiced by Johansson.
Johansson claimed Sky's voice is so similar to hers that her family, friends, and the general public told her it sounded like her. OpenAI pulled the Sky voice in response to her accusation. It claims it independently hired voice talent to create ChatGPT voices and settled on five different ones. OpenAI didn’t expressly deny that it sought a voice actress who sounds like Johansson, and it declined to name its voice actors “to protect their privacy.”
Reacting to rising AI voice mimicry, Tennessee recently enacted the ELVIS Act, which imposes criminal and civil liability for using AI to mimic someone’s recognizable voice without permission. That makes sense, as Tennessee is the country music capital of the U.S. The law extends liability to people who knowingly publish a fake voice and, in the case of advertisers, when they should have known it was fake. It also extends liability to any company or individual producing AI technology with a “primary purpose or function” of making AI fakes.
The FCC Cracked Down on AI-Generated Fake Robocalls.
The Federal Communications Commission (“FCC”) recently ruled that robocalls using voices generated by artificial intelligence are illegal. AI-generated fake voices in robocalls are a problem. A recent robocall campaign using an AI-generated voice mimicking President Biden targeted New Hampshire residents to discourage them from voting in the state’s Democratic primary. AI has also been used to extort money from families by mimicking a loved one in danger or distress.
The FCC can fine violators and block telephone companies from carrying the calls. The FCC ruling also allows victims to sue the robocall originators and gives state attorneys general additional tools to prosecute bad actors.
AI Fakes of Political Figures
What about political figures? Don’t you have a First Amendment free speech right to mimic them in advertisements? Generally, no. Politicians also receive protection against unauthorized use of their NIL for commercial purposes, including postmortem rights for as long as applicable state law gives such rights. Free speech principles don’t override that.
What if your AI fakery is obviously a parody, such as a phony Joe Biden endorsing hair-care products or counterfeit Donald Trump endorsing a gym chain? Trying this is legally risky. You will be liable if some in the public don’t get the joke, meaning some think the endorsement might be real. And even if everybody gets the joke, if a parody is an advertisement to sell some good or service, that commercial aspect might make it legally unprotected.
Sometimes, a parody is protected from liability if it itself is the product (for example, a Donald Trump doll with long spiky orange hair like a toy troll), but not if it is just an ad vehicle for selling some other product or service.
AI Fakes of Ordinary People
Finally, what if your AI-generated person happens to have the appearance and voice of a non-celebrity?
After all, you can use AI technology to learn someone’s appearance, voice, and mannerisms and then have the AI generate audio and video of that person saying or doing anything. Using AI this way could save time by making it unnecessary to shoot new video for each new idea. Because the right of publicity in most states protects all people, this too requires getting a license to use the NIL of the person depicted.
So, don’t use AI to create a fake Taylor Swift endorsement for your business. You might be “Enchanted” by her market appeal, but when you get sued by her, it would be hard to “Shake it Off.”
Written on May 21, 2024
by John B. Farmer
© 2024 Leading-Edge Law Group, PLC. All rights reserved.