The ads were part of a massive campaign this week for a deepfake app, which allows users to swap any face into any video of their choosing.
Deepfake creators commonly make videos in which celebrities appear to be willingly participating, even though they are not. Increasingly, the technology has been used to make nonconsensual pornography featuring the faces of celebrities, influencers or private individuals.
On Sunday and Monday, an app for creating “DeepFake FaceSwap” videos rolled out more than 230 ads on Meta’s services, including Facebook, Instagram and Messenger, according to a review of Meta’s ad library.
Some of the ads showed what looked like the beginning of pornographic videos, with the recognizable intro track of the porn platform Pornhub playing. Seconds in, the women’s faces were swapped with those of famous actors.
“This could be used with high schoolers in public schools who are bullied,” said Lauren Barton, a journalism student in Tennessee. “It could ruin somebody’s life. They could get in trouble at their job. And this is extremely easy to do and free. All I had to do was upload a picture of my face and I had access to 50 free templates.”
Of the Meta ads, 127 featured the actor Emma Watson’s likeness. Another 74 featured the actor Scarlett Johansson’s face swapped onto women in similarly provocative videos. Neither actor responded to a request for comment.
While no sexual acts were shown in the videos, their suggestive nature illustrates how the application can potentially be used to generate faked sexual content. The app allows users to upload videos to manipulate and also includes dozens of video templates, many of which appear to be taken from TikTok and similar social media platforms.
The terms of service for the app, which costs $8 per week, say users may not impersonate others or upload sexually explicit content. The app developer listed on the App Store is called Ufoto Limited, owned by a Chinese parent company, Wondershare. Neither company responded to a request for comment.
While the nonconsensual sharing of sexually explicit photos and videos is illegal in most states, laws addressing deepfake media are in effect only in California, Georgia, New York and Virginia.