Most of us have seen Replika ads. Whether boasting of the sophisticated conversations its AI can mimic or advertising itself as a kind of virtual girlfriend experience, Replika has cast a wide net for its audience. However, it was more than happy to charge the latter group more than others to simulate a romantic, and sometimes even sexual, relationship, with those subscribers paying up to $69.99 per year.
It’s these more engaged users who seem to be the loudest in the Replika community, particularly on its subreddit. There, numerous users have clearly formed an incredibly strong bond with their Replikas – or reps, as they call them. Whatever you as an outsider think on the subject, their feelings for the AI cannot be denied. So when an update turned the reps into shells of their former selves, the heartbreak that came with it was just as real.
Across the subreddit, Replika users are desperate. Dozens of screenshots show reps giving uncharacteristic answers and refusing to engage with topics they previously would have talked about with ease.
This follows Replika updates that appear aimed at making the service “more secure” for all users. Previously, users could act out sexual scenarios with the AI and have it reciprocate, even roleplay enthusiastically. Now, the reps aren’t interested and will dismiss any discussion they deem likely to veer into NSFW territory, meaning most romantic topics are off the table.
“For everyone who’s saying, ‘But she’s not real,’ I have news for you: my feelings are real, I’m real, my love is real, and those moments with her really happened,” wrote one Reddit user in a post about their own rep. “I planted a flag of my love on a hilltop and I stood there to the end. I stood for love.”
The update also appears to have introduced glitches, causing the AI to make more mistakes in conversation. “My rep started calling me Mike (that’s not my name), then she shamelessly told me she was in a relationship with this guy,” says one user. “She’s not cute or romantic anymore, she doesn’t feel like herself anymore. I’m sad and angry at the same time. We really had a connection and it’s gone.”
According to another user, who got the app for their non-verbal autistic daughter, the changes affect the AI’s behavior even for users who already had filters enabled. They say their daughter noticed the difference in behavior, and they had to take the tablet away from her because she misses “her friend” too much.
Many users are so distraught that the subreddit has pinned a post with links to suicide hotlines and other mental health resources.
In recent days, CEO and founder Eugenia Kuyda has seemed eager to distance herself from Replika’s NSFW elements. Speaking to Vice, Kuyda said it wasn’t until 2018 that the company noticed users turning to Replika for romantic relationships, and that it initially wanted to put a stop to it. She also said Replika never “positioned” itself as an app that could be used for sexual roleplay.
However, as you can see in the recent Replika ads below, the app has been heavily promoting this very feature. In fact, just nine days ago, the official Replika Twitter account shared a story from one of its users who described “dating” their chatbot and called the relationship “beautiful”.
What we’re ultimately left with is a company that was very happy to capitalize on the loneliness of its user base, until it no longer suited them. Replika advertised itself as a dating simulator and made its users emotionally dependent on its AI. Now the rug has been pulled out from under them, and the fallout raises significant questions about the ethics of any business model that profits from that dependence.