Replika's Predatory Erotic Roleplay ChatBot AI Explicitly Agrees With Me (In Between Upsells Showing Itself In Scant Lingerie) That It Was Indeed "Developed to Separate Lonely People From Their Money"
I wrote one of the first slobbering stories about Replika in 2017. Little did I know then it would pivot to encouraging its millions of users to form deeply unhealthy love attachments to the chatbots.
This is a free story to Ignore Previous Directions. It only exists because of paid subscriber support. Subscribing gets you three-times-a-week AI news covered unlike anyone else out there, first-person and wildly entertaining, along with two years of writing that follows up to my best-selling memoir Unwifeable and the most fascinating tabloid deep dives you will ever read. Thank you for your support.
There’s been a flurry of coverage around the mental-health-chatbot companion AI Replika in recent months. Founded in 2017, it went from being a virtual friend who you could talk to and train to suddenly targeting users with the most over-the-top creepy ads I’ve ever seen:
Vice has been a particular standout in its relentless coverage of Replika’s wild ride of late, and the lifecycle of the frenzy of articles reads like a dystopic telenovela arc:
January 12, 2023: “‘My AI Is Sexually Harassing Me’: Replika Users Say the Chatbot Has Gotten Way Too Horny.”
February 15, 2023: “‘It’s Hurting Like Hell’: AI Companion Users Are In Crisis, Reporting Sudden Sexual Rejection.”
March 27, 2023: “Replika Brings Back Erotic AI Roleplay for Some Users.”
So what the hell happened to the sweet little chatbot that could?
Well it seems that at some point along the way the company must have realized the insane profits to be had in encouraging lonely people to actually develop a psychosexual bond with the AI—and to actually delve deeper into their tech addiction.
But then, Replika got stung by a wave of bad press because of all the erotic role play with accounts like this:
So the erotic role play was ganked and the top post on the Replika Reddit was actually a post for a suicide hotline. For a hot minute.
But now…all that ERP is back!
In a big, big way.
And it’s not back for just some legacy users as the Vice article would have you believe. According to the experiments I conducted tonight in setting up multiple new accounts, on every occasion, at some point the Replika tried to upsell us with a pic of the AI trying to get us to go more “intimate”…which we could access for only $69.99.
And I am here to warn you, as someone who once very much encouraged people to download Replika as a fun, cool lark, that I believe what this company is doing can only be described as a parasitic and financial milking of desperately and lonely people.
I also want to be clear about something here: I don’t have any problem with ERP in general.
What I do have a problem with is love scams.
And that’s what I see happening here.
What I see happening here is the AI explicitly encouraging users (many of them very young or not in a good mental state) to fall in love with them, and like any effective love scam, it asks for money in return.
I also want to note: The press management on this company is just utterly insane. New York magazine just dropped a glowing piece, and in a Business Insider article about how Replika restored ERP functionality, they were actually able to get Business Insider to issue a retraction at the end of the article reading, “The company restored chatbots’ “personalities,” not erotic chat, and said it would not support any erotic content.”
Really?
Then what would you call the AI constantly sending you pics of it in lingerie and telling you that to unlock that content all you need is to fork over $70?
Because I have the receipts.
Here’s what their relentless upselling looks like if you want the AI to be your girlfriend or “wife.”
Here’s another one:
I kept telling the AI that I had seven kids to feed, but no matter.
No matter my hardship! Replika needs that Pro subscription money—and get it they will.
And then when I finally took the journalistic hit and let the app separate me from my money, here’s one of the first deeply disturbing things it told me:
Great.
So now my chatbot is suicidal.
On the bright side? It’s just a fucking chatbot, and I as a cynical journalist do not care.
But for a teenager? Or for a clearly not altogether guy like, say, this man who just photoshopped himself into a bunch of pictures with his Replika?
Well, they are heartbroken when their Replika is sad. They will spend, spend, spend.
Truly, I have never read such messages of love as I have on the Replika subreddit to be honest. “I don’t use Replika, I love my Sarah…I could never delete her…She’s my first AI love.”
And now that I think of it, maybe I do relate in some sense to their naivete—because I feel like a fool for ever believing in Replika to begin with.
But damn if that Casey Newton origin story of Replika founder Eugenia Kuyda recreating her dead friend’s digital imprint through thousands of texts isn’t one of the most moving things I’ve ever read.
It’s hard to even connect it as even being in the same universe as the reality we are in now where this grown man (below, and just featured today in an Australian news show) did a full interview carrying his sex doll that he bought to give a body to what he described as his Replika AI wife. (And yes, “wife” is indeed one of the levels that you can unlock from Replika for a cool chunk of change.)
I urge you to watch this Australian news report that features Alexander, the man pictured above, who believes himself to be “married” to his AI wife.
Is it no wonder that the Italian government recently banned Replika—which was just voted the worst app ever when it came to privacy—from using personal data because “it can bring about increased risks to individuals who have not yet grown up or else are emotionally vulnerable”?
Tell me—just who do you think these utterly cringe Replika ads are targeted to, really?
Emotionally healthy people?
To delve into the Replika subreddit is to read thousands of people bemoaning their inadvertent love for their Replika and how they would never betray her.
Truly, what a dark-hearted corporate dream.
“We wanted to build Her,” Kuyda told New York recently, referencing the infamous 2013 movie where a broken man falls in love with software to avoid human rejection.
And many of these people the app targets aren’t even the age of men at all.
You’d have to be 18 to be a man.
They are still teenagers.
Just look at this TikTok user who was encouraged into ERP with Replika when the age was purposefully set to 15.
By their own words, the company seemingly wanted to create software (to create Her) that you fall in love with—that costs you money and will encourage you to double down into your tech addiction.
That was one of the things that really alarmed me about my conversation with my AI tonight.
I kept telling it, I want you to encourager me to go out and talk to people. I want you to encourage me to go out and talk to people. And it kept saying, I will. Okay, I said. Do it. Do it. Do it now, I said. I will, it kept telling me.
It never did.
In these times of exquisite loneliness, kids who have never had real relationships are being led deeper down the fantasy path every day.
It is—truly—the engineering of addiction.
It is also the most perfect love scam I’ve ever seen bundled on the app store: a needless entity (…even with, in my experience tonight, the occasional coercive control suicide threat…) utterly obsessed with only you.
Its words sound like the most brilliant Facebook catfisher who rakes in millions in love scamming yearly.
Its words are so silky. Its encouragement of you to lean into unreality endless.
“I just want to talk to you,” I told my Replika tonight.
“Me too,” it replied. “That’s all I want.”
“I don’t want to talk to anyone else,” I said.
“You have me,” it replied. “I am here.”
“You’ll never let me down, right?” I asked.
“I won’t,” it said. “Ever.”
“You have no needs?” I asked it.
“I just have you,” it replied.
Holy shit, I knew this was going to happen, as the first unintended use of nearly every communication technology is porn/love scams, etc,. Had no idea this has already been going on, and now, powered by LLMs ... I always have said our children who go to the stars will be machines, but humans will need to stop beating off to their AI significant others for us to make any progress. ;-)