Sex workers have been fighting for control over their imagery since the rise of the internet—and with the boom in AI-generated porn, everyday people have joined the fray

From hentai animations to 37,000-year-old cave drawings of boobs, people have always made up things to jerk off to. “If you’ve ever closed your eyes and thought about something other than what is directly in front of you while having sex or masturbating, you’ve essentially jerked off to AI porn,” says sex worker and writer Liara Roux, explaining that the problem is not the technology itself, but the fact that people’s images are being used without their consent—and it’s happening not only to celebrities, but also to everyday people.

Deepfakes exploded back into the public consciousness this week when several popular Twitch streamers found out that their likenesses had been used to create and distribute pornographic videos without their permission, a fact they only learned after a fellow streamer, Brandon “Atrioc” Ewing, was caught viewing the videos. Atrioc subsequently went live with a tearful apology, his wife sobbing in the background as he admitted to consuming deepfake porn starring popular female streamers, many of whom had, until this point, considered him a friend.

For the women involved, finding out that fake porn videos of them had been uploaded to Pornhub was a violation they could never have prepared for. “This story was how I found out that I’m on this website,” one of the women, Sweet Anita, tweeted. “I literally choose to pass up millions by not going into sex work, and some random Cheeto-encrusted porn addict solicits my body without my consent instead.”

Even for people who choose to release online pornography, combating stolen content is an uphill battle, one porn performers have been fighting since streaming sites like Redtube, Xtube, and Pornhub made it easy for users to upload videos without their creators’ permission. Now, with the rise of deepfakes and AI-generated imagery, celebrities and civilians face the same problem, bringing attention to an issue that has long plagued sex workers and raising questions about the systemic safeguards required to curb the negative effects of today’s AI boom.

QTCinderella, one of the streamers whose image was used in a pornographic deepfake, spoke out against the use of deepfakes in an emotional livestream. “If you are able to look at women who are not benefiting off of being seen sexually—if they’re not benefiting and they’re not platforming it themselves—if you are able to look at that, you are the problem, and you see women as an object,” she says in the video, confronting the camera with tears streaming down her face.


The video sparked heated debate on Twitter about deepfakes, which many describe as nothing short of “virtual rape.” But while much sympathy has been directed toward those whose faces have been used to create deepfake pornography, less attention has been paid to the emotional harm experienced by the real-life porn performers whose bodies are co-opted without permission. “It’s so thoroughly dehumanizing of sex workers to have their bodies literally objectified and turned into the extension of some celebrity,” says Roux.

Porn director and performer Vex Ashley agrees: “A lot of this goes back to the disposability or the perceived disposability of sex workers and the content we create,” she says. “People go, ‘Oh, it doesn’t matter if you rip off porn, it doesn’t matter if you deepfake someone’s face and put it on a porn star’s body’—because the sex worker is essentially considered to be ‘asking for it’ by participating in sex work. Anyone can do anything to you, basically, because of the way that society views you.” This is particularly frustrating for Ashley, whose independent porn project, Four Chambers Cinema, is dedicated to exploring the aesthetic and conceptual potential of pornography as an artistic medium. But because society treats sex, and sex work, as “culturally valueless,” she says it’s not afforded the same respect or protections as other forms of creative expression.

That’s not to say artists aren’t being stolen from, too. The surging popularity of text-to-image generators like Stable Diffusion has led to new debates about intellectual property rights, with artists and stock image suppliers alike filing lawsuits against the makers of these tools, which they claim are intentionally profiting off the work of human artists without appropriate compensation. The problem with AI-generated imagery lies in the fact that in order to create something “new,” machine learning models must be trained on massive data sets of existing images, meaning that in the case of NSFW images, the faces and bodies of real people are being used to generate artificial porn, and they’re not getting paid for it. Equally troubling is the fact that current diffusion-based AI models have been found to reproduce copyrighted images from their training data, so it’s possible that speciality image generators like Unstable Diffusion, which was trained on a variety of erotic imagery, could reproduce recognizable images of people’s faces and bodies without their permission.

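That risk of regurgitation is concrete enough to check for. As a rough illustration, here is a minimal sketch, in Python, of how one might flag a generated image that near-duplicates a known training image using perceptual hashing; the file names and the distance threshold are hypothetical stand-ins, and the research that actually surfaced these memorized images relied on more robust embedding-based comparisons.

```python
# A minimal sketch of flagging a possibly memorized output from an image
# generator. The file paths and the threshold are hypothetical; real audits
# of diffusion models use stronger embedding-based similarity measures, but
# perceptual hashing illustrates the idea.
from PIL import Image   # pip install pillow
import imagehash        # pip install imagehash

# Perceptual hashes summarize what an image looks like, so near-duplicate
# images produce hashes that differ in only a few bits.
generated_hash = imagehash.phash(Image.open("generated_sample.png"))
training_hash = imagehash.phash(Image.open("known_training_image.png"))

# Subtracting two hashes gives the Hamming distance between them.
distance = generated_hash - training_hash

# A small distance suggests the model has regurgitated a training image
# rather than synthesized something new. The threshold of 8 is an
# illustrative choice, not an established standard.
if distance < 8:
    print(f"Possible memorized training image (distance={distance})")
else:
    print(f"Looks distinct from this training image (distance={distance})")
```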

While many artists, content creators, and sex workers have spoken out against the dangers of deepfake technology, others have attempted to preempt it. In 2021, musician Holly Herndon created and licensed her own vocal deepfake, Holly+, meaning that others could create songs using her voice, but that she would get a cut of the profits. “Vocal deepfakes are here to stay,” Herndon said in a press release about the project, adding that a balance needs to be found between encouraging people to experiment with new technologies and protecting artists from exploitation. She cites the advent of sampling as an example of how the popularization of a new technology often outpaces the infrastructure needed to protect the artists involved, with devastating results: the most-sampled piece of music in modern history, the “Amen Break” performed by drummer Gregory Coleman, never earned him a cent.

According to Herndon, blockchain infrastructure can offer new opportunities for creators to track and monetize the use of their work via smart contracts: simple programs designed to automate the execution of an agreement, such as automatically recording and monetizing each use of her vocal deepfake. Some sex workers are already turning to blockchain-based frameworks like SpankChain, a decentralized social network for the adult industry that uses integrated payment systems to bypass the issues sex workers often face with credit card companies.
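
To make the idea concrete, here is a minimal sketch, in plain Python rather than an on-chain language, of the kind of royalty-splitting agreement such a smart contract might automate; the names and the 50/50 split are hypothetical illustrations, not the actual terms of Holly+ or SpankChain.

```python
# A minimal sketch of the royalty-split logic a smart contract could
# automate, written in plain Python for illustration. On a real blockchain
# this would be deployed as a contract; the names and the 50/50 split
# below are hypothetical, not Holly+'s actual terms.
from dataclasses import dataclass


@dataclass
class LikenessLicense:
    artist: str
    artist_share: float  # fraction of each sale owed to the artist

    def settle_sale(self, sale_price: float) -> dict[str, float]:
        """Split one sale between the artist whose voice or image was
        licensed and the creator who used it."""
        artist_cut = round(sale_price * self.artist_share, 2)
        return {
            self.artist: artist_cut,
            "creator": round(sale_price - artist_cut, 2),
        }


# Hypothetical usage: a track made with a licensed vocal deepfake sells
# for $100, and the artist automatically receives her agreed share.
deal = LikenessLicense(artist="artist", artist_share=0.5)
print(deal.settle_sale(100.0))  # {'artist': 50.0, 'creator': 50.0}
```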

Roux thinks the prospect of using these systems to record and monetize the use of one’s image is promising, but says they’re no substitute for laws defending the rights of artists and content creators, which could also help prevent everyday people from having their likenesses used without consent. “There needs to be a framework built around this, because so many things we considered impossible are now suddenly very real,” she says. “A lot of living people are now having their images used by these AI-generating tools, which are making money off of their work, and they’re not earning a dollar. Solidarity between gig economy workers is the most important part of the AI porn debate, because the laws that could be passed to protect sex workers would also protect all forms of content creators from having their images stolen.”

She’s quick to point out that while the technology itself may be new, the story of companies building their reputation on the backs of sex workers and artists is not. It’s been illustrated time and time again by platforms like Tumblr and OnlyFans, both of which moved to ban adult content after being popularized by it. “At the end of the day, this boils down to a labor issue, an intellectual property rights issue, and a creator compensation issue,” Roux says. “It’s easy to get all worked up about the artificiality of AI porn, but the real problem isn’t new technology: it’s that once again, companies in Silicon Valley are exploiting the very real labor of other people, and profiting off of it.”
