Exploring the dark, liberating, and potentially catastrophic future of technology's freakiest frontier

“Jack Park Canny Dope Man” dropped on February 13, the day before Valentine’s Day, our sexiest national holiday. It’s, you know, that kind of slow-roll, husky hip-hop that’s autotuned to high hell, accessorized with a fiendish little hook that’s calculated to catch you. Slurred with furry electronic erotics, the lyrics are almost beside the point—they’re cool enough, as far as you can tell. The video is a candy-coated computer glitch fest, and it features a lithe rapping dude who slides from the requisite Lambo to the tacit approval of twerking, puffer-coated, hyper-bendy Youngs. The video stops to clutch lyrics almost at random. “I don’t really wanna fuck your party food,” it says. “I can’t eat all on this money now.” Sure, you think, who does? Who can?

The dude rapping in “Jack Park Canny Dope Man” looks a lot like Travis Scott. The rub—because something as artificial as “Jack Park Canny Dope Man” has to have a rub—is this: It’s not Travis Scott. It’s TravisBott, and it’s a song written by a computer. The video is a deepfake of Travis Scott, the song composed by AI fed with the Scott oeuvre, and it’s a gift courtesy of space150, a digital marketing agency. No humans, other than coders and background dancers, got their hands dirty making “Jack Park Canny Dope Man.” Vulture notes that the deepfake song is a “weirdflex,” but it’s aight, as the kids once said. It doesn’t slap, but how jarring, how whimsical, how shattering if it did.

An emerging digital technology with its roots in the 1990s, deepfakes are manipulated videos where, most simply put, an existing actor’s face is digitally replaced with that of someone else. (Deepfake audios do the same thing, but with voices, while shallowfakes are minimally edited or merely decontextualized videos.) If you’ve ever seen Home Stallone, or that clip where Obama says, “Killmonger was right,” or the video where Steve Buscemi in a red cocktail dress waxes eloquent on his favorite Housewives, you’ve witnessed a deepfake. Sometimes they’re funny, sometimes they’re disquieting, and sometimes they’re something like art. But these positive qualities are inherent to deepfakes that announce themselves as fake—and, mostly, these videos feature men.

Patently fake deepfakes may be funny-satirical-cool, but they’re not the only kind. They’re not in the majority, and they don’t grab the headlines or cue the concern-mongering. Those dubious honors are reserved for deepfakes about politics and, of course, fucking. Often both, usually the latter, but make it nonconsensual.

While a satirical deepfake might amuse, an earnest deepfake can throw you into an uncanny Matrix/Truman Show hybrid, calling into question everything you thought you knew was real. “We tend to trust the things that we see happening in front of us with our own two eyes the most. And after that, video. And after that, audio. And after that, textual descriptions,” says Kate Klonick, an assistant professor of law at St. John’s Law School with a background in cognitive psychology. The curious thing about deepfakes, Klonick says, is that “you can watch something with your own two eyes, and all of your senses, and all of the information that’s available to you tells you it’s true, and you’re still not able to suss out the truth of the physicality of what you’re seeing.”

In this time of creeping incertitude and simmering distrust of news, the potential power of convincing, well-wrought, virtually undetectable deepfakes rightly raises a shuddering horror. Not for nothing, Vladimir Putin has successfully deployed altered videos, usually sexual in nature, against his political enemies for decades. Oddly, however, it’s not so much politically motivated deepfakes that have put pundits’ and publishers’ panties in a collective bunch. It’s the porn. As the commonplace goes, the internet is made for porn, and nothing if not a natural denizen of the internet, deepfake porn abounds.

Deeptrace is a two-year-old organization whose mission is to be “at the forefront of threat research and detection solutions against deepfakes.” In September 2019, Deeptrace released its annual report on the state of deepfakes, and it’s enough to give pause to even a skeptic like me. Using a combination of Google searches, redirect links, web scraping, and “proprietary” data extraction and analysis tools, Deeptrace found that of the almost 15,000 existent deepfake videos (double the number that the organization found in 2018), 96 percent were porn. All of these deepfake porn videos featured female subjects, and almost all of those female subjects work in entertainment as actors or musicians. In other words, if you’re Gal Gadot or Scarlett Johansson or a female member of a K-pop group, your face has been nonconsensually grafted onto an adult performer’s body in a flickering, fuzzy approximation of sex, and there’s not much you can do about it.

Some outlets try to police porn deepfakes, others not so much. Go to Pornhub, the internet’s biggest free porn purveyor, search “deepfake,” and you’ll find nothing. Go to YouPorn and search the same thing, and you’ll find a lot. Same if you name-search a celebrity along with the term “deepfake.” When these fetid Google fruits appear on my monitor, I feel gross and I feel bad, both for the hapless celebrity whose face has been grafted and for the adult performer whose work has been jacked, relegated to an oblivious body onto which some lame programmer with a Faceswap app can project whatever needs/desires/rage he feels, harming two specific women as well as countless other women who sense the menace behind the malice.

Privacy expert Danielle Keats Citron, a Boston University professor of law, articulates the ways that deepfake porn affects its subjects. “It is terrifying, embarrassing, demeaning, and silencing,” Citron writes in the Deeptrace report, further claiming that the presence of nonconsensual sexual deepfakes “can make it difficult to stay online, get or keep a job, and feel safe.” In her 2019 Yale Law Journal article, “Sexual Privacy,” she highlights the appalling deepfake experiences not just of celebrities but also of a journalist, an Australian teen-turned-activist, and an unnamed woman who is a target of so-called revenge porn. (Citron did not respond to repeated interview requests.)

Citron is not alone in homing in on the threats that deepfakes hold specifically for women. Journalists, like VICE’s Samantha Cole, have written multiple articles on the rise of nonconsensual porn, specifically deepfakes, and how their male makers deploy them as a method to exert “their full, fantastical way with women’s bodies.” Ask not from whom the deepfake moans, these thinkers imply, it moans from thee.

But, I can’t help asking, is porn what we need to fear from deepfakes? What if deepfake porn—like the four percent of deepfakes that aren’t sexual—didn’t suck? And, finally, what if we’re right to be afraid of deepfakes but we’re afraid for the wrong reasons?

Read your average alarmist deepfake piece, and you’ll be led to believe that any humanoid with a computer, an app, and a yen for destruction can whip together a convincing video in a matter of minutes (and for as little as $30). This perception is not entirely accurate, especially if quality is key. There are two basic ways to make deepfake videos. One uses an app to swap out one face for another and, unsurprisingly, more computing power and more complex coding lead to a more convincing result. Ars Technica reporter Timothy B. Lee needed two weeks, massive rented computing power, and just over $500 to create a 37-second deepfake of the Star Trek character Data inhabiting Mark Zuckerberg’s body. It’s a lot of steps, a lot of time, and a fair chunk of change for a half-minute of fun.

This face-swapping technology does, however, offer one method of making ethical deepfake porn. Kate Devlin, a computer scientist and author of Turned On: Science, Sex and Robots, suggests that face-swapping deepfakes could be a “way of making custom pornography.” She postulates, “The ethical way of doing this is having performers who are paid to perform, knowing that their bodies will be used in the footage, and someone else pays to have their face [added to] that footage. But it needs to be tied to consent, so, for example, you could add your own face but not someone else’s.” At least one company, Naughty America, has started employing this business model, giving porn consumers the chance to see themselves doing things that their bodies can’t do, whether because of physical limitations, monogamy, proximity, or willingness. Consent and cash are the key points here, and that’s as much as you can ask in this time of late-stage capitalism.

Another, more intriguing way to create deepfakes is by using a generative adversarial network, or GAN, which produces wholly synthetic digital video; a GAN’s output can look human without involving any actual humans. The tech works like this: Imagine four clueless kids playing Pictionary. One kid is drawing a cowboy on a horse, her teammate is blankly staring out the window, and the opposing team is determining whether a) it’s a horse, b) it’s a cowboy, or c) it’s a centaur, all while trying to judge whether the opposing team’s picture responds to the clue “Lone Ranger.” Then imagine that instead of four kids, you have two opposing data networks busily drawing and assessing images, and at the end, you get a convincingly lifelike synthetic photo, or a video that ranges from a remarkably crisp video game to an unsettlingly squashy Albert Einstein. You can, of course, use GAN to create porn.
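For the technically curious, the Pictionary standoff can be boiled down to a toy training loop. The sketch below is a deliberately minimal, one-dimensional “GAN” in pure Python: no images, hand-derived gradients, and every number and name invented for illustration. A two-parameter generator tries to mimic a target bell curve, while a logistic discriminator tries to tell real samples from fakes; each learns only from the other’s judgments, which is the core of the adversarial idea.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    # Clamp to avoid math.exp overflow on extreme inputs.
    return 1.0 / (1.0 + math.exp(-max(-60.0, min(60.0, x))))

# Toy "real" data: samples from a 1-D Gaussian the generator must imitate.
def sample_real():
    return random.gauss(4.0, 1.25)

# Generator: maps noise z ~ N(0, 1) to a sample via g(z) = a*z + b.
# Discriminator: scores a sample via d(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0   # generator parameters
w, c = 0.0, 0.0   # discriminator parameters
lr = 0.05

for step in range(2000):
    z = random.gauss(0.0, 1.0)
    x_real = sample_real()
    x_fake = a * z + b

    # --- Discriminator step: push d(real) toward 1, d(fake) toward 0 ---
    d_real = sigmoid(w * x_real + c)
    d_fake = sigmoid(w * x_fake + c)
    # Gradients of -[log d_real + log(1 - d_fake)] w.r.t. w and c:
    gw = -(1 - d_real) * x_real + d_fake * x_fake
    gc = -(1 - d_real) + d_fake
    w -= lr * gw
    c -= lr * gc

    # --- Generator step: push d(fake) toward 1, i.e. fool the judge ---
    d_fake = sigmoid(w * x_fake + c)
    # Gradients of -log d_fake w.r.t. a and b (chain rule through x_fake):
    ga = -(1 - d_fake) * w * z
    gb = -(1 - d_fake) * w
    a -= lr * ga
    b -= lr * gb

# After training, the generator's samples typically drift toward the
# real distribution's mean of 4.0 without ever seeing a real sample directly.
fakes = [a * random.gauss(0.0, 1.0) + b for _ in range(1000)]
mean_fake = sum(fakes) / len(fakes)
print(round(mean_fake, 2))
```

A production GAN swaps these two-parameter models for deep convolutional networks trained by automatic differentiation on millions of images, but the alternating discriminator-then-generator rhythm above is the same one that produces synthetic faces.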

GAN porn would be very expensive and time intensive, yet it also raises a host of fascinating possibilities, because building with GAN allows designers to create a synthetic porn universe composed of fictional human subjects. In 2013, while synthetic-imagery tech was in its infancy, the Dutch arm of Terre des Hommes, a children’s rights activist group, used computer-generated imagery for good when it created “Sweetie,” a fictional digital Filipina child, and deployed the synthetic image as part of a successful sting operation against webcam sex tourism. A fake kid who can effectively trap real criminals is a strong recommendation for the technology, yet deploying synthetic media to catch sex predators is but one way that synthetic porn could be a boon to the world.

Let’s turn to a porn-maker to explain. Stoya, an adult performer and cofounder of ZeroSpaces, an adult website, sees the upside of wholly synthetic porn. “I could be making zero-risk intense porn scenes with this technology,” she says succinctly. Synthetic porn, Stoya observes, would remove the risks of sexually transmitted infections and on-site physical injuries. Moreover, she says, “we could do whatever we want and not be limited by the range of what a human body can handle.” Imagine vigorous sex on a tiny, wobbly kitchen island! Imagine sex with a giant woman and a tiny man! Imagine realistic furry sex without suits and with furry genitals! Imagine, if you will, a porn paradise unbound from temporal, gravitational, or biological restraints! The libido reels, though the mind can’t help wondering if pushing fantasy to its physical limits wouldn’t also raise already high IRL expectations to stratospheric hopes and disastrous results.

“There are constructive uses of deepfakes,” Kate Klonick agrees. “GAN can completely create images of people who do not actually exist, so you…could have pornography that doesn’t involve people putting themselves at risk for all of the long-term harms that might fall out from being in a pornographic movie or video.” These risks include potential physical harm and possible STIs, but they also include future regret, particularly when the performer is, as the parlance goes, “barely legal.” Synthetic porn allows the visual of 18-year-olds while not employing any 18-year-olds in front of a camera. When you consider that human brains don’t fully mature until 25, you can understand that some young adults make bad decisions—and I say this as a person who started stripping in a small town at 19, quit, and returned to stripping at 30.

The ways that GAN porn might benefit people, whether performers or consumers, extend beyond simply shielding the Youngs from bad decisions. Because GAN creates images of imaginary people, GAN can create “barely legal” porn that’s entirely legal—and right now, consumers can’t be certain that the performers have reached a legal age. Even Pornhub, arguably the most responsible of the free porn conglomerates, uses something akin to an honor system, where users uploading the videos must certify that the performers are over 18. In this system, even the most conscientious porn consumer can unknowingly watch illegal porn.

GAN further complicates the issue of CSAM, or “child sexual abuse material,” as the National Center for Missing and Exploited Children terms it. Despite predating GAN technology by more than a decade, the PROTECT Act of 2003 anticipated how tech could create CSAM. The act, submitted by Utah Senator Orrin Hatch and then-Indiana Congressman Mike Pence, explicitly bans “a digital image, computer image, or computer-generated image of, or that is indistinguishable from an image of, a minor engaging in… sexually explicit conduct.” While there’s some convincing research that suggests that conventional porn reduces rape, and while some activist judges have opposed long mandatory sentences for child sex abuse offenders, no one knows whether digital CSAM would slake or incite pedophiles’ desires, and no one wants to risk harming actual children through the creation of digital porn.

Tony Oursler [ s~iO. ], 2017, aluminum, acrylic paint, and LCD screen, sound. 52 x 37.25 x 3” (132.1 x 94.6 x 7.6 cm)

Even if you table underage synthetic GAN porn, the technology retains the potential to wrong adult performers, even furthering the exploitation that porn performers experience with face-swapping deepfakes. Stoya imagines a hypothetical performer, Suzy Bangmycheeks, who decides to quit the porn industry. “One of Suzy Bangmycheeks’ fans just can’t get enough of her,” Stoya proposes, “and he starts making synthetic porn. Now she’s trying to move on, but it looks like she’s still releasing content all the time. And she’s not profiting. She’s being exploited.” Stoya suggests we should “think about installing [legal] protections for public figures or people who have performed in front of cameras.” It’s not just Hollywood actresses and K-pop stars whose images are being stolen by deepfakes, and porn performers have even less legal recourse than more conventional stars.

Most important, it’s not just famous people’s images that need protecting—it’s everyone’s data. Your data, my data, your neighbor’s data, your dad’s, and your kid’s. Karl T. Muth, who teaches law and economics at Northwestern University, offers one hair-raising point about porn that reaches far beyond deepfakes and even far beyond porn itself. “It isn’t just the person in the video for whom [porn deepfakes] should present a worry. It’s the deep knowledge that these platforms hold,” Muth says. “These companies have a trove of information that is so accurate that almost never in America, for instance, is a heterosexual male browsing around online offered gay porn.” The opposite scenario is likely just as true.

It’s a chilling thought. I don’t watch porn—I don’t enjoy it—but when I looked at Pornhub and YouPorn for this story, I got served straight cishet female porn, videos that accurately check my sexuality boxes. “The most troubling thing about deepfakes,” says Muth, “is that they are a symptom that the platforms hosting them know enough about the general population to market them. If these platforms didn’t understand the audience, deepfakes would be worthless. They’d be impossible to monetize.” Someone, somewhere, does really want to fuck your party food. Porn sites know this, and they are poised to eat all on this money now.

Beyond the existential threats that data pose to all of us, Muth pinpoints the true danger of deepfakes to the “already vulnerable populations who are likely to be victimized by this technology.” Sure, the less-funded candidate in a tight race is at risk, but so too is the random black youth. “If I’m a cop and I need to make my quota this month,” speculates Muth, “and I’ve got a video already of a young black male robbing a liquor store, and I’ve got a photo of a young black male that I need evidence against, gee, why don’t I just download some open-source software?” Muth has a point: If the video looks real, who’s going to know the truth? How could juries accurately assess a videotape that appears to be real? Ensuring that cops’ bodycams are equipped with cryptographic fingerprints would be one way to avert this issue, but requiring all security cameras to have this technology is a big ask.

“As much as I’m a tech optimist, deepfakes are an area of AI I find worrying,” admits Kate Devlin. “The potential for damage is far greater than the potential for good.” And as much as I hate to admit it, Devlin may be right.

I would like to entertain a virtual world where furries can get it on in technicolor splendor, where humans are free to engage in reckless sex with riskless abandon, where we could tailor-make porn that would excite even porn naysayers like me, yet I remain skeptical. Deepfakes are frightening, though not because I’m afraid someone will jack my face and slap it on an adult performer. For everything that deepfakes show—the mercenary misogyny lurking below the shiny surface—it’s what they don’t show that scares me more. When our data give us what we want to see, how can we look away? This is the true uncanny valley, the space between the real and the fake, and the blank willingness to believe what we’re seeing with our own eyes.
