Deepfakes, which have long been used to create explicit content without consent, are now being leveraged to craft a new kind of trauma porn

True crime content, for many who consume it, is a guilty pleasure. Love it or hate it, hearing the graphic details of real-life murders makes people uneasy, especially when those stories are told without the consent of the people most affected by the tragedy. But the murky ethics of the genre haven’t stopped Americans from getting hooked: according to a YouGov poll, nearly half of the country’s population watches or listens to true crime content, with one-third of respondents doing so more than once a week. And thanks to AI, things are about to get a whole lot weirder, because TikTok, once known as the app for viral dances, has become home to deepfakes of murdered children recounting the details of their own grisly deaths.

In these videos, the victims’ faces are digitally altered in the AI renderings, perhaps to dodge TikTok’s recent ban on synthetic content depicting the likenesses of private individuals, or “out of respect for the families,” as some of the accounts claim. But these videos, many of which have accrued millions of views, detail the abuse suffered by real victims, from children locked in ovens to babies stabbed to death, exploiting their memories for the sake of clicks. While accounts like My Criminal Story market the videos as an engaging new way to listen to true crime stories, they also have the potential to “re-victimize people who have been victimized before,” criminal justice professor Paul Bleakley told Rolling Stone. “Imagine being the parent or relative of one of these kids in these AI videos. You go online, and in this strange, high-pitched voice, here’s an AI image [based on] your deceased child, going into very gory detail about what happened to them.”

Some parents, however, have chosen to reanimate their murdered children using the technology. Last year, Alison Cope, the mother of rapper Joshua Emmanuel Ribera, better known as Depzman, spearheaded the creation of a deepfake music video for an anti-knife-crime campaign, in which the murdered rapper describes the night he was stabbed to death. The difference, of course, is consent. And while it’s obvious that deepfaked videos of murdered children are insensitive to the victims’ families, the fact that their subjects are deceased could make it hard to sue on grounds of defamation, since in most jurisdictions a defamation claim dies with the person it concerns.

Deepfakes have proven tricky to regulate, with legislators struggling to reconcile individual privacy and well-being with First Amendment rights. Celebrities and public figures can leverage right-of-publicity laws to protect their image and likeness from commercial exploitation, and those laws have recently been extended to cover postmortem impersonations. But for deepfakes of private individuals, defamation law is one of the few protections available, and it may be especially hard to argue in court, given that the creators of AI-generated true crime videos often tweak minor biographical details while drawing heavily on the facts of a specific case.

The problems faced by victims of deepfake pornography may differ, but they sit in the same legal gray area: while such use of the technology is clearly unethical, no federal law makes it illegal. Protections against nonconsensual deepfakes vary by state; some, including Virginia and California, have banned deepfake pornography, and a bill recently introduced in Congress would make sharing nonconsensual AI-generated pornography illegal nationwide. But the same laws that could provide much-needed protection for victims of revenge porn do little to help the grieving families of those who have fallen victim first to murder, and then to the true crime industrial complex. Unless lawmakers add a caveat for trauma porn, that is.
