In an essay for The Atlantic last month, Nina Jankowicz wrote about what it was like to discover that she'd been deepfaked into pornographic material.
"Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn't shocked," wrote Jankowicz, the former executive director for the United States' since-disbanded Disinformation Governance Board. "The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology — and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent."
Jankowicz, like a growing number of people, was recently the victim of a disturbing and rapidly proliferating genre of pornography in which the faces of individuals — often famous celebrities, lawmakers, streamers, influencers, and other public figures, though non-famous people as well — are inserted into pornographic content using increasingly popular and broadly available AI tools. Though not technically illegal in most places, this content is invasive, violating, and above all else, non-consensual. (It's also worth noting that while anyone can be targeted by deepfaked porn, it overwhelmingly affects women.)
With that in mind, you might imagine that deepfaked porn, like other kinds of violating content, is difficult to find — or that, at the very least, it takes more than the simplest of Google searches to track down. Sure, there's a lot of awful stuff online, but it usually takes some shred of effort to find it.
Unfortunately, that couldn't be further from the truth. Google "deepfake porn" and you'll be met with pages of links to sites that offer easy access to it. The same is true for searches on rival engines including Bing and DuckDuckGo.
Worse, as NBC's Kat Tenbarge reported back in March, these sites seem to be making money on the nonconsensual material. And while some sites support themselves by paywalling the nonconsensual content, the first Google result for "deepfake porn," a site called MrDeepFakes, offers its fake porn videos of celebrities and influencers to users for free, appearing to instead cash in on old-fashioned advertising dollars — a revenue model likely made possible by its top-spot Google ranking.
To be clear, these sites are not at all shy about their offerings.
"MrDeepFakes is the best celebrity deepfake porn tube site featuring celeb porn videos and fake nude photos of actresses, YouTubers, Twitch streamers..." reads MrDeepFakes' site description, as visible in Google search.
"Real celebrity porn is hard to come by," reads a separate MrDeepFakes page description, similarly visible in search results, "this is why we make our own."
"The Greatest Deepfake Porn Site Ever Made," another SEO-friendly site, dubbed DeepFucks.com, proclaims in search-visible subtext. "Watch or download the most realistic celebrity deepfake porn videos from the world's best deepfake porn creators."
Of course, considering that deepfaked porn has yet to be explicitly outlawed by most state legislatures or federal lawmakers — stand-out exceptions being efforts in California, Georgia, Virginia, and New York — Google and other search platforms don't technically have to ban this kind of material, even though it has the potential to ruin lives. But tech platforms have their own rules for a reason, and a review of various search content policies shows that deepfaked porn seemingly violates several platform-set guidelines.
Out of the three search platforms in question, Google's policies are undoubtedly the most expansive.
"Google Search encompasses trillions of web pages, images, videos and other content, and the results might contain material that some could find objectionable, offensive or problematic," reads Google's content policy. "Google's automated systems help protect against objectionable material."
"We may manually remove content that goes against Google's content policies, after a case-by-case review from our trained experts," it continues. "We may also demote sites, such as when we find a high volume of policy content violations within a site."
From there, in its "overall content policies for Google Search" — which, per the section's description, apply to "content surfaced anywhere within Google Search, which includes web results" — deepfaked porn is named as a distinct policy violation.
"Google might remove certain personal information that creates significant risks of identity theft, financial fraud, or other specific harms," reads the section, "which include, but aren't limited to, doxxing content, explicit personal images, and involuntary fake pornography."
So Google specifically names nonconsensual deepfake porn as an offender within its broader search policies. It also says that any site containing a large amount of policy-violating content may be subject to manual demotion in results. And needless to say, MrDeepFakes and its compatriots all archive countless videos that seemingly cross the red lines in all of the above policies.
Bing owner Microsoft doesn't expressly name deepfaked porn in its services agreement, but it does have extensive policies related to revenge porn. DuckDuckGo similarly doesn't offer specific protection against deepfaked porn in its terms of use, but when we reached out, a spokesperson for the search privacy firm clarified that though it "gets its results from many sources, our primary source for traditional web links is Bing."
In light of all that, then, why are these sleazy sites — which again, are hosting explicit, nonconsensual, and manipulated content — still so ridiculously easy to find on major search engines?
And what's perhaps most frustrating, besides the fact that this stuff exists in search at all, is the search engines' unanimous focus on victim action — as opposed to any sweeping measures taken of their own accord.
"We recognize it can be distressing when involuntary fake pornography is discoverable in Google search results," reads a Google request for deepfake removal portal. "This article is intended to support you through the process to request the removal of such content from Google search results."
When we reached out to Google, a spokesperson pointed us to their victim support portal, further noting that their content policies go beyond what's legally required.
"While search engines allow people to find content that is available on the open web, Google Search has policies and systems to help protect people impacted by involuntary fake pornography," the spokesperson told us over email. "People can use our policies to request the removal of pages about them that include this content from Google Search. In addition, we fundamentally design our ranking systems to surface high quality information, and to avoid shocking people with unexpected harmful or explicit content when they aren't looking for it."
Still, most of the information that Google directed us to deals with the policing and removal of individual pages, and not entire domains dedicated to policy-violating material. Porn exists across the internet, and it would be unreasonable to, say, shut all Pornhub results out of Google's index because some pages violate content policy standards. But when a website's whole deal is offering nonconsensual, manipulated pornographic material? That's a very different story. (In defense of their URL-focused system, Google told us that a page-by-page approach to removing harmful content ensures that information that might be helpful for victims, like contact and policy pages, stays available.)
Over email, DuckDuckGo's spokesperson also urged users experiencing any "issues" to submit their own feedback.
"For issues or problematic content, we encourage people to submit feedback directly on our search engine results page," they added. "Our Privacy Policy also points users here. The page outlines users' privacy rights along with a way to get in touch with us."
Bing owner Microsoft declined to provide a statement, with a rep for the company instead directing us to — you guessed it! — their victim support portal.
And sure, on the one hand, it's good that platforms have some kind of system that allows victims to personally address the issue.
While these forms are good to have, relying on them as the primary means of countering deepfakes puts the bar for protection in the basement. Plus, as rapidly advancing AI tools become ever more publicly available and widely adopted, we can only expect the prevalence of deepfaked porn to increase. Is trusting Google Alerts to tell you whether someone has used a generative AI-powered tool to swap your face for a porn actor's really the best that tech giants can do?
Don't get us wrong: government regulation of deepfaked content, in addition to AI technologies as a whole, is needed and important.
But at the end of the day, it's absurd that it's harder to find a pirated movie on search platforms than deepfake porn. The former is a copyright violation; the latter is an express violation of sexual consent with the potential to cause very real harm to a victim's mental and emotional health, well-being, and security. Search engines need to do better — especially given that some of them, Google in particular, are already within their own policy rights to crack down.
More on deepfaked porn: Twitch Streamer Tearfully Apologizes for Looking at Deepfaked Porn