Grim.
Dressed Down
AI image generators that claim the ability to "undress" celebrities and random women are nothing new — but now, they've been spotted in monetized ads on Instagram.
As 404 Media reports, the ad library of Meta — the parent company of Facebook and Instagram — hosted several paid posts promoting so-called "nudify" apps, which use AI to generate deepfaked nudes from clothed photos.
In one ad, a photo of Kim Kardashian was shown next to the words "undress any girl for free" and "try it." In another, two AI-generated photos of a young-looking girl sit side by side — one showing her wearing a long-sleeved shirt, the other appearing to show her topless, with the words "any clothing delete" covering her breasts.
Over the past six months, these sorts of apps have gained unfortunate notoriety after they were used to generate fake nudes of teen girls at schools in the United States and Europe, prompting investigations and proposed laws aimed at protecting children from such harmful uses of AI. As Vice reported at the end of last year, students in Washington said they found the "undress" app they used to create fake nudes of their classmates via TikTok advertisements.
Why go overseas for a nudify tool to exploit teen girls in your school when you can get them on Instagram? pic.twitter.com/Yrvw4r8F7t
— SwiftOnSecurity (@SwiftOnSecurity) March 30, 2024
Takedown Request
In its investigation, 404 found that many of the ads its reporters came across had already been removed from the Meta Ad Library by the time they checked, while others were only taken down once it alerted a company spokesperson to their existence.
"Meta does not allow ads that contain adult content," the spokesperson told the website, "and when we identify violating ads we work quickly to remove them, as we’re doing here."
Still others, however, remained live when 404 published its story, suggesting that, as with so many content enforcement efforts, Meta is taking a whack-a-mole approach to banning these sorts of ads even as new ones crop up.
Last summer, Futurism found that Google was readily directing searchers to deepfake porn that featured not only celebrities spoofed into nude photos, but also lawmakers, influencers, and other public figures who didn't consent to such use of their images. In a cursory search, Google still listed "MrDeepFakes," the biggest purveyor of such content, as the first result for "deepfake porn."
During its investigation, 404 found that one of the apps in question prompted users to pay a $30 subscription fee to access its NSFW capabilities but was ultimately unable to generate nude images. Still, it's terrifying that such things are being advertised on Instagram at all, especially considering that, per a Pew poll from last year, 50 percent of teens report using the Meta-owned app daily.
More on deepfakes: AI-Powered Camera Takes Pictures of People But Instantly Makes Them Naked