What is deepfake porn and why is it thriving in the age of AI?

Sophie Maddocks, a doctoral candidate in the Annenberg School for Communication, addresses the growing problem of image-based sexual abuse.

Sophie Maddocks is a doctoral candidate in the Annenberg School for Communication. (Image: Courtesy of Annenberg School for Communication)

With rapid advances in AI, the public is increasingly aware that what you see on your screen may not be real. ChatGPT will write everything from a school essay to a silly poem. DALL-E can create images of people and places that don’t exist. Stable Diffusion or Midjourney can create a fake beer commercial—or even a pornographic video with the faces of real people who have never met.

So-called “deepfake porn” is becoming increasingly common: creators take paid requests for pornography featuring a person of the buyer’s choice, and a plethora of fake not-safe-for-work videos circulates on sites dedicated to deepfakes.

The livestreaming site Twitch recently released a statement against deepfake porn after a slew of deepfakes targeting popular female Twitch streamers began to circulate.

Annenberg School for Communication doctoral candidate Sophie Maddocks studies image-based sexual abuse, like leaked nude photos and videos and AI-generated porn.

Deepfake porn, according to Maddocks, is sexual visual content created with AI technology that anyone can access through apps and websites. One common approach uses deep learning algorithms trained to remove clothing from images of women and replace it with images of naked body parts. Although these algorithms could also “strip” men, they are typically trained on images of women.

“The rise of AI porn adds another layer of complexity to this, where new technologies like Stable Diffusion create fake porn images,” says Maddocks. “These synthetic sexual images are AI-generated, in that they are not depicting real events, but they are trained on images of real people, many of which are shared non-consensually. In online spaces, it is difficult to disentangle consensually from non-consensually distributed images.”

She adds, “Creating fake erotic images is not inherently bad; online spaces can be a great way to explore and enjoy your sexuality. However, when fake nude images of people are created and distributed without their consent, it becomes deeply harmful.”

Are researchers or activists proposing ways to combat deepfake porn? “It is difficult to envisage solutions that address deepfake porn without challenging the broader cultural norms that fetishize women’s non-consent,” says Maddocks. “The rise of misogyny online, through which some men perceive themselves to be victims of increasing efforts toward gender equality, creates the conditions for deepfake porn to proliferate as a form of punishment targeted toward women who speak out.”

As for the trajectory of her research, Maddocks says, “I’m increasingly concerned with how the threat of being ‘exposed’ through image-based sexual abuse is impacting adolescent girls’ and femmes’ daily interactions online. I am eager to understand the impacts of the near constant state of potential exposure that many adolescents find themselves in.”

This story is by Hailey Reissman. Read more at Annenberg School for Communication.