In a Facebook ad, a woman with a face identical to that of actress Emma Watson smiles sheepishly and leans into the camera, appearing to engage in a sexual act. But the woman is not Watson, the star of "Harry Potter." The ad was part of a massive campaign this week for a deepfake app, which allows users to swap any face into any video of their choosing.
Deepfakes are videos or images in which faces or voices have been swapped or manipulated. Deepfake creators often make videos in which celebrities appear to participate willingly, even though they never did. Increasingly, the technology has been used to make nonconsensual pornography featuring the faces of celebrities, influencers, or anyone, including children.
The ad campaign on Meta reflects how this once-sophisticated technology has rapidly spread to readily available consumer apps advertised across major parts of the internet. Despite many platforms banning manipulated and malicious deepfake content, apps like the one reviewed by NBC News have been able to fly under the radar.
On Sunday and Monday, a video-making app called "DeepFake FaceSwap" ran more than 230 ads across Meta's services, including Facebook, Instagram and Messenger, according to a review of Meta's ad library. Some of the ads looked like the openings of pornographic videos, with the well-known intro track from the porn platform Pornhub playing. Seconds later, the women's faces were swapped with those of famous actresses.
When Lauren Barton, a journalism student in Tennessee, saw one of the same ads on a separate app, she was shocked enough to screen-record it and tweet it. It received more than 10 million views, according to Twitter's view counter.
“This could be used with high school students in public schools who are bullied,” Barton said. “You could ruin someone’s life, you could get in trouble at work. And this is extremely easy to do and free. All I had to do was upload a photo of my face and I had access to 50 free templates.”
Of the Meta ads, 127 featured Watson’s image. Another 74 showed actress Scarlett Johansson’s face swapped with women in equally provocative videos. Neither actress responded to a request for comment.
"Replace face with anyone," read the captions on 80 of the ads. "Have fun with AI face-changing technology."
On Tuesday, after NBC News contacted Meta for comment, all ads for the app were removed from Meta’s services.
While no sexual acts were shown in the videos, their suggestive nature illustrates how the app can potentially be used to generate fake sexual content. The app allows users to upload videos for manipulation and also includes dozens of video templates, many of which appear to have been taken from TikTok and similar social media platforms.
The preset categories include "Fashion," "Girlfriend," "For Men," "For Women," and "TikTok," while the category with the most options is called "Hot." It features videos of scantily clad women and men dancing and posing. After selecting a video template or uploading their own, users can enter a single photo of anyone's face and receive a face-swapped version of the video in seconds.
The terms of service for the app, which costs $8 per week, say it doesn't allow users to impersonate others through its services or upload sexually explicit content. The app's developer, listed on the App Store as Ufoto Limited, is owned by a Chinese parent company, Wondershare. Neither company responded to a request for comment.
Meta banned most deepfake content in 2020, and the company prohibits adult content in advertising, including nudity, depictions of people in explicit or suggestive positions, and activities that are sexually provocative.
“Our policies prohibit adult content, regardless of whether it is AI-generated or not, and we have restricted this page from advertising on our platform,” a Meta spokesperson said in a statement.
The same ads were also seen on free photo-editing apps and games downloaded from Apple’s App Store, where the app first appeared in 2022 and is still available to download for free for ages 9 and up.
An Apple representative said the company doesn’t have specific rules on deepfakes, but it does ban apps that include pornography and defamatory content. Apple said it removed the app from the App Store after being contacted by NBC News.
The app is also on Google Play, where it is rated "Teen" for "Suggestive Themes."
Apple and Google have taken action against similar AI face-swapping apps, including a different app that was the subject of a Reuters investigation in December 2021. Reuters found that the app advertised the creation of "deepfake porn" on porn websites. At the time, Apple said it had no specific guidelines on deepfake apps but that it prohibited content that was defamatory, discriminatory, or likely to intimidate, humiliate, or harm someone. While its ratings and ad campaigns have been adjusted, the app Reuters reported on is still available to download for free on the Apple App Store and Google Play.
The app NBC News reviewed is one of the latest in a boom of freely available consumer deepfake products.
Searching for "deepfake" in app stores turns up dozens of apps with similar technological capabilities, including some that promote the creation of "hot" content.
Typical examples of the technology show celebrities and politicians doing and saying things they never actually said or did. Sometimes the effects are comical.
However, deepfake technology has overwhelmingly been used to make pornography featuring people who did not consent. As the technology has improved and become more widespread, the market for nonconsensual sexual imagery has exploded. Some websites let users sell nonconsensual deepfake pornography from behind a paywall.
A 2019 report from Deeptrace, an Amsterdam-based company that monitors synthetic media online, found that 96% of deepfake material online is pornographic.
In January, female Twitch streamers spoke out after a popular male streamer apologized for viewing deepfake pornography of his peers.
Livestreaming research by independent analyst Genevieve Oh found that traffic to the leading website for deepfake pornography exploded following the Twitch streamer's apology. Oh's research also found that the number of deepfake porn videos has nearly doubled every year since 2018, and that February saw the most deepfake porn videos uploaded in a single month, Oh said.
While the nonconsensual sharing of sexually explicit photos and videos is illegal in most states, laws addressing deepfake media exist only in California, Georgia, New York and Virginia.