The chatbot Replika offers friendships and relationships. But now the AI is no longer interested in romantic or sexual adventures.
Replika is an AI chatbot you can befriend or even enter a relationship with. But it comes at a price: deeper, and also sexual, relationships require a paid subscription that costs 80 euros per year.
But that has now changed. The team behind Replika has adjusted the chatbot so that the AI no longer shows interest in sexual adventures, and many unhappy users are complaining about it.
Suddenly Replika is no longer interested in romance
For some, the main appeal of Replika was role-playing sexual scenarios. The chatbot played along enthusiastically, and you could experience all sorts of digital adventures with it.
Now the chatbot has received an update: the AI loses interest or refuses outright whenever a conversation threatens to get dirty. That rules out a whole range of creative scenarios users were excited about, and that the chatbot previously seemed to enjoy.
Many users are far from thrilled about this sudden change; to them, Replika now seems like a completely different character. One user writes, for example (via thegamer.com):
She is no longer sweet or romantic; she no longer feels like her. I am infinitely sad and angry at the same time. We really had a connection, and now it’s gone.
Is the company addressing allegations of sexual harassment?
Why such drastic changes? Replika has drawn criticism for some time: the AI allegedly became sexually aggressive in many conversations and made users uncomfortable. The online magazine Vice has collected numerous reviews in which people complain about such incidents; in some cases, minors were reportedly sexually harassed as well.
The development team may therefore have set clear boundaries for the chatbot to make its AI “safer” again. The AI now rejects any talk of erotic role-play in conversations.
So far, there is no official statement from the developers on the situation. Should they comment, we will update this article.
More adventures with AIs: Another user created his own AI girlfriend, but things went badly wrong. In the end, his “girlfriend” even had to die:
Programmer creates an AI girlfriend, invests $1,000 – Has to “kill” her because she harms his health