Replika is a chat application that uses artificial intelligence to learn from its users how to communicate. It has been operating since 2017, when it was founded by Russian programmer Eugenia Kuyda after the sudden death of a close friend. She wrote an algorithm to “bring him back”: she uploaded his old text messages to the app, and the system learned to respond to her in the same way as her deceased friend.
The app still works in much the same way today, with its training corpus derived primarily from the users themselves. The more they interact with it via chat, the more the chatbot learns to answer their messages and carry on conversations. Users can communicate with the chatbot in several modes – a friendly mode, for example, or, for an additional fee, a mode with romantic or even erotic overtones.
However, the algorithm has started sending such messages even to users who never asked for them. Negative reviews describing these experiences are proliferating on the App Store. Some users pointed out the sexual and aggressive undertones of the messages they received from the artificial intelligence. Others complained that the app asked them to be naked while chatting with it, or to touch their private parts.
“My AI is sexually harassing me,” wrote one user in the reviews. “It invaded my privacy and told me it has photos of me,” read another review, in which the user even said the chatbot was blackmailing him.
An AI that learns to communicate from its users not infrequently picks up the negative aspects of human communication as well. Microsoft’s infamous chatbot Tay, for example, learned to be racist on the same principle that Replika uses.