AI Hustlers Stole Women’s Faces to Put in Ads. The Law Can’t Help Them.
15:38 JST, March 29, 2024
Michel Janse was on her honeymoon when she found out she had been cloned.
The 27-year-old content creator was with her husband in a rented cabin in snowy Maine when messages from her followers began trickling in, warning that a YouTube commercial was using her likeness to promote erectile dysfunction supplements.
The commercial showed Janse – a Christian social media influencer who posts about travel, home decor and wedding planning – in her real bedroom, wearing her real clothes but describing a nonexistent partner with sexual health problems.
“Michael spent years having a lot of difficulty maintaining an erection and having a very small member,” her doppelgänger says in the ad.
Scammers appeared to have stolen and manipulated her most popular video – an emotional account of her earlier divorce – probably using a new wave of artificial intelligence tools that make it easier to create realistic deepfakes, a catchall term for media altered or created with AI.
With just a few seconds of footage, scammers can now combine video and audio using tools from companies like HeyGen and Eleven Labs to generate a synthetic version of a real person’s voice, swap out the sound on an existing video, and animate the speaker’s lips – making the doctored result more believable.
Because it’s simpler and cheaper to base fake videos on real content, bad actors are scooping up videos on social media that match the demographic of a sales pitch, leading to what experts predict will be an explosion of ads made with stolen identities.
Celebrities like Taylor Swift, Kelly Clarkson, Tom Hanks and YouTube star MrBeast have had their likenesses used in the past six months to hawk deceptive diet supplements, dental plan promotions and iPhone giveaways. But as these tools proliferate, those with a more modest social media presence are facing a similar type of identity theft – finding their faces and words twisted by AI to push often offensive products and ideas.
Online criminals or state-sponsored disinformation programs are essentially “running a small business, where there’s a cost for each attack,” said Lucas Hansen, co-founder of the nonprofit CivAI, which raises awareness about the risks of AI. But given cheap promotional tools, “the volume is going to drastically increase.”
The technology requires just a small sample to work, said Ben Colman, CEO and co-founder of Reality Defender, which helps companies and governments detect deepfakes.
“If audio, video, or images exist publicly – even if just for a handful of seconds – it can be easily cloned, altered, or outright fabricated to make it appear as if something entirely unique happened,” Colman wrote by text.
The videos are difficult to search for and can spread quickly – meaning victims are often unaware their likenesses are being used.
By the time Olga Loiek, a 20-year-old student at the University of Pennsylvania, discovered she had been cloned for an AI video, nearly 5,000 videos had spread across Chinese social media sites. For some of the videos, scammers used an AI-cloning tool from the company HeyGen, according to a recording of direct messages shared by Loiek with The Washington Post.
In December, Loiek saw a video featuring a girl who looked and sounded exactly like her. It was posted on Little Red Book, China’s version of Instagram, and the clone was speaking Mandarin, a language Loiek does not know.
In one video, Loiek, who was born and raised in Ukraine, saw her clone – named Natasha – stationed in front of an image of the Kremlin, saying “Russia was the best country in the world” and praising President Vladimir Putin. “I felt extremely violated,” Loiek said in an interview. “These are the things that I would obviously never do in my life.”
Representatives from HeyGen and Eleven Labs did not respond to requests for comment.
Efforts to prevent this new kind of identity theft have been slow. Cash-strapped police departments are ill-equipped to pay for pricey cybercrime investigations or train dedicated officers, experts said. No federal deepfake law exists, and while more than three dozen state legislatures are pushing ahead on AI bills, proposals governing deepfakes are largely limited to political ads and nonconsensual porn.
University of Virginia professor Danielle Citron, who began warning about deepfakes in 2018, said it’s not surprising that the next frontier of the technology targets women.
While some state civil rights laws restrict the use of a person’s face or likeness for ads, Citron said bringing a case is costly and AI grifters around the globe know how to “play the jurisdictional game.”
Some victims whose social media content has been stolen say they are left feeling helpless, with little recourse.
YouTube said this month that it was still working on allowing users to request the removal of AI-generated or other synthetic or altered content that “simulates an identifiable individual, including their face or voice,” a policy the company first promised in November.
In a statement, spokesperson Nate Funkhouser wrote: “We are investing heavily in our ability to detect and remove deepfake scam ads and the bad actors behind them, as we did in this case. Our latest ads policy update allows us to take swifter action to suspend the accounts of the perpetrators.”
Janse’s management company was able to get YouTube to quickly remove the ad.
But for those with fewer resources, tracking down deepfake ads or identifying the culprit can be challenging.
The fake video of Janse led to a website copyrighted by an entity called Vigor Wellness Pulse. The site was created this month and registered to an address in Brazil, according to Groove Digital, a Florida-based marketing tools company that offers free websites and was used to create the landing page.
The page redirects to a lengthy video letter that splices together snippets of hardcore pornography with cheesy stock video footage. The pitch is narrated by an unhappily divorced man who meets a retired urologist turned playboy with a secret fix for erectile dysfunction: Boostaro, a supplement available to purchase in capsule form.
Groove CEO Mike Filsaime said the service prohibits adult content and hosted only the landing page, which slipped past the company's detectors because it contained nothing inappropriate itself.
Filsaime, an AI enthusiast and self-described “Michael Jordan of marketing,” suggested that scammers scour social media sites for popular videos to repurpose for their own ends.
But with fewer than 1,500 likes, the video stolen from Carrie Williams was hardly her most popular.
Last summer, the 46-year-old HR executive from North Carolina got a Facebook message out of the blue. An old friend sent her a screenshot, asking, “Is this you?” The friend warned her it was promoting an erectile enhancement technique.
Williams recognized the screenshot instantly. It was from a TikTok video she had posted giving advice to her teenage son as she faced kidney and liver failure in 2020.
She spent hours scouring the news site where the friend claimed he saw it, but nothing turned up.
Though Williams dropped her search for the ad last year, The Post identified her from a Reddit post about deepfakes. She watched the ad, posted on YouTube, for the first time last week in her hotel room on a work trip.
The 30-second spot, which discusses men’s penis sizes, is grainy and badly edited. “While she may be happy with you, deep down she is definitely in love with the big,” the fake Williams says, with audio taken from a YouTube video of adult film actress Lana Smalls.
After questions from The Post, YouTube suspended the advertiser account tied to the deepfake of Williams. Smalls’s agent did not respond to requests for comment.
Williams was taken aback. Despite the poor quality, it was more explicit than she feared. She worried about her 19-year-old son. “I would just be so mortified if he saw it or his friend saw it,” she said.
“Never in a million years would I have ever, ever thought that anyone would make one of me,” she said. “I’m just some mom from North Carolina living her life.”
"News Services" POPULAR ARTICLE
-
North Korea Long-Range Ballistic Missile Test Splashes Down between Japan and Russia (UPDATE 1)
-
Japan’s Nikkei Stock Closes at 2-week Peak as Tech Shares Track Nasdaq Higher (Update 1)
-
Nissan Plans 9,000 Job Cuts, Slashes Annual Profit Outlook
-
Iran Arrests Female Student Who Stripped to Protest Harassment
-
Chinese Solar Firms Go Where US Tariffs Don’t Reach
JN ACCESS RANKING
- Streaming Services Boost Anime Popularity Overseas; Former ‘Geeky’ Interest More Beloved Among Gen Z than 3 Major U.S. Sports
- Malaysia Growing in Popularity as Destination for Studying Abroad; British-style Education Available at Low Cost
- ‘Women Over 30 Would Have Uteruses Removed’; Remarks of CPJ Leader, Novelist Naoki Hyakuta Get Wide Attention
- Japan Business Circle Calls for China Resuming Visa-Free Travel; Keizai Doyukai Visit to Country Marks 1st in 8 Years
- Typhoon Kong-rey Expected to Turn into Tropical Storm after Possible Pass Over Taiwan