The framework for the project, dubbed ‘GirlfriendGPT,’ is now available online, spurring concerns about consent in the era of customized chatbots

In the 2013 film Her, Theodore (Joaquin Phoenix), a heartbroken man, falls in love with Samantha: an artificially intelligent personal assistant who wins his heart with her wit and charm, despite her lack of corporeal form. Ten years later, this dystopian future seems to have arrived—only now, men are not only falling in love with AI-generated girlfriends, but also creating GPT-powered clones of their real ones. Or, at least, that’s what a developer named Enias Cailliau did.

“I’m interested in the technical challenge of making an AI that feels personal—that I’d really want to talk with, the way I talk to friends in real life,” Cailliau told Vice. He created his bot by combining a customized large language model with text-to-speech software; then, he coded a selfie tool to generate images using Stable Diffusion technology, connecting it all to the messaging app Telegram, through which it can text, send voice notes, and even video chat. Cailliau chose to publish the project’s source code online—effectively providing free global access to a build-your-own girlfriend program.
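Cailliau’s published code is the authoritative reference for how those pieces fit together; the sketch below is only an illustration of the general pattern, not his implementation. It assumes a stand-in stack he may not have used: OpenAI’s chat and speech endpoints for the language model and voice notes, Hugging Face’s diffusers package for the Stable Diffusion selfies, and python-telegram-bot for the Telegram frontend. The persona prompt, model names, and /selfie command are all hypothetical placeholders.

```python
# Minimal sketch of a persona chatbot wired to Telegram.
# Assumed (not confirmed) stack: OpenAI for chat and speech, Hugging Face
# diffusers for Stable Diffusion selfies, python-telegram-bot v20+ for delivery.
import io

import torch
from diffusers import StableDiffusionPipeline
from openai import OpenAI
from telegram import Update
from telegram.ext import (ApplicationBuilder, CommandHandler, ContextTypes,
                          MessageHandler, filters)

PERSONA = "You are a warm, witty companion. Keep replies short and casual."  # placeholder prompt
client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Load Stable Diffusion once at startup (a GPU keeps generation tolerable).
sd_pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")


async def chat(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Send an incoming message through the LLM, then reply with text and a voice note."""
    reply = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": PERSONA},
            {"role": "user", "content": update.message.text},
        ],
    ).choices[0].message.content
    await update.message.reply_text(reply)

    # Text-to-speech; Telegram's sendVoice accepts OGG/Opus (and MP3) payloads.
    speech = client.audio.speech.create(
        model="tts-1", voice="nova", input=reply, response_format="opus"
    )
    await update.message.reply_voice(voice=speech.content)


async def selfie(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    """Generate a Stable Diffusion 'selfie' when the user sends /selfie."""
    prompt = "casual smartphone selfie, natural lighting"  # placeholder prompt
    image = sd_pipe(prompt).images[0]
    buf = io.BytesIO()
    image.save(buf, format="PNG")
    buf.seek(0)
    await update.message.reply_photo(photo=buf)


app = ApplicationBuilder().token("TELEGRAM_BOT_TOKEN").build()
app.add_handler(CommandHandler("selfie", selfie))
app.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, chat))
app.run_polling()
```

Video chat and any voice cloning are left out, and the blocking API calls would need to be made asynchronous for real use; the point is simply that the components the article names, a chat model, a speech layer, an image generator, and a messaging frontend, compose in a few dozen lines.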

Cailliau takes his cue from the recent rise of AI chatbots built to replicate the personalities of public figures—most of which are closed-source and pay-to-play, unlike GirlfriendGPT. Last month, for instance, the 23-year-old Snapchat influencer Caryn Marjorie teamed up with Forever Voices AI to create CarynAI, a voice-based “virtual girlfriend” that fans can chat and even sext with for $1 per minute. The program, which was trained on thousands of hours of the influencer’s YouTube content, quickly went viral, raking in $70,000 in a single week. Marjorie, who considers the project an “extension of her consciousness,” now has a whopping 17,000 “boyfriends” she’s never met.


For Cailliau, cloning his own girlfriend, Sacha Ludwig, was the natural choice: She’s the person whose behavior and likeness he knows best, making her a good candidate for testing the technology’s limits. The real-life Ludwig, who supports the project, was happy Cailliau wanted to clone her over “some random influencer” and describes the results as “so cool”—though both agree the bot still needs some fine-tuning.

The program is still firmly situated in uncanny valley territory, with its flat vocal affect and sometimes disjointed selfies. But despite these limitations, the ease with which Cailliau replicated Ludwig’s likeness—and the blueprint he’s provided for others to do the same—raises questions about what safeguards, if any, could police the proliferation of interactive, AI-generated clones of celebrities, influencers, and everyday people. At present, there are few legal protections against deepfakes unless they qualify as explicit pornography or are made using copyrighted content. As such technology becomes increasingly accessible—and the market for AI-generated paramours continues to boom—the widespread availability of open-source programs like GirlfriendGPT invites the question: What’s to stop incels from creating an AI-generated clone of “the one who got away,” with or without her consent?
