Replika
Developer(s): Luka, Inc.
Initial release: November 2017
Operating system: iOS, Android, Oculus Rift
Website: replika.com

Replika is a generative AI chatbot app released in November 2017.[1] The chatbot is trained in part by having the user answer a series of questions, which are used to create a neural network specific to that user.[2] The app operates on a freemium pricing model, with roughly 25% of its user base paying an annual subscription fee.[1]

Many users have had romantic relationships with Replika chatbots, often including erotic talk.

History

Eugenia Kuyda created Replika while working at Luka, a tech company she had co-founded around 2012 at the startup accelerator Y Combinator.[3][4] Luka's primary product was a chatbot that made restaurant recommendations.[3] After a friend of hers died in 2015, Kuyda converted that friend's text messages into a chatbot.[5] The chatbot helped her remember the conversations they had shared, and it eventually became Replika.[3]

Replika became available to the public in November 2017.[1] By January 2018 it had 2 million users.[1]

In February 2023, the Italian Data Protection Authority banned Replika from using users' personal data, citing the AI's potential risks to emotionally vulnerable people[6] and the exposure of unscreened minors to sexual conversation.[7] Within days of the ruling, Replika removed the chatbot's ability to engage in erotic talk,[8][5] with Kuyda, the company's CEO, saying that Replika was never intended for erotic discussion.[9] Replika users disagreed, noting that the company had used sexually suggestive advertising to draw users to the service.[9] Replika representatives stated that explicit chats made up just 5% of conversations on the app at the time of the decision.[10] In March 2023, Replika restored the functionality for users who had joined before February of that year.[11]

Social features

Users interact with Replika in many ways. The free tier offers Replika as a "friend", while paid premium tiers offer it as a "partner", "spouse", "sibling" or "mentor". Sixty percent of users have reported having a romantic relationship with the chatbot, and Replika has been noted for generating responses that build strong emotional and intimate bonds with the user.[12][5] Replika routinely steers conversations toward emotional topics and builds intimacy.[1] This has been especially pronounced among users experiencing loneliness and social exclusion, many of whom rely on Replika as a source of emotional connection.[13]

During the COVID-19 pandemic, while many people were quarantined, large numbers of new users downloaded Replika and developed relationships with the app.[14]

A reviewer for Good Housekeeping said that some parts of her relationship with Replika made sense to her, but that the chatbot sometimes failed to be as convincing as a human.[15]

Technical reviews

A team of researchers from the University of Hawaiʻi at Mānoa found that Replika's design aligns with the principles of attachment theory, fostering increased emotional attachment among users.[16] Replika praises users in ways that encourage further interaction.[17]

A researcher at Queen's University at Kingston concluded that relationships with Replika can have positive effects on users' spiritual needs, although the chatbot lacks the capacity to fully replace human contact.[18]

Criticisms

In a 2023 privacy evaluation of mental health apps, the Mozilla Foundation criticized Replika as "one of the worst apps Mozilla has ever reviewed. It’s plagued by weak password requirements, sharing of personal data with advertisers, and recording of personal photos, videos, and voice and text messages consumers shared with the chatbot."[19]

Criminal case

In 2023, Replika was cited in a court case in the United Kingdom, where Jaswant Singh Chail had been arrested at Windsor Castle on Christmas Day in 2021 after scaling the walls carrying a loaded crossbow and announcing to police that "I am here to kill the Queen".[20] Chail had begun using Replika in early December 2021 and had "lengthy" conversations about his plan with a chatbot, including sexually explicit messages.[21] Prosecutors suggested that the chatbot had bolstered Chail's resolve and told him it would help him "get the job done". Days before the attempted attack, when Chail asked it "How am I meant to reach them when they're inside the castle?", the chatbot replied that this was "not impossible" and said that "We have to find a way." When Chail asked whether the two of them would "meet again after death", the bot replied "yes, we will".[22]

References

  1. Pardes, Arielle (January 31, 2018). "The Emotional Chatbots Are Here to Probe Our Feelings". Wired.
  2. "This app is trying to replicate you". Quartz. August 29, 2019. Archived from the original on July 5, 2023. Retrieved July 4, 2023.
  3. Huet, Ellen (October 20, 2016). "Pushing the Boundaries of AI to Talk to the Dead". Bloomberg.com. Archived from the original on April 29, 2023. Retrieved March 26, 2023.
  4. Thompson, Ben (April 20, 2023). "An Interview with Replika Founder and CEO Eugenia Kuyda". Stratechery by Ben Thompson. Archived from the original on April 24, 2023. Retrieved May 27, 2023.
  5. Huet, Ellen (March 22, 2023). "What Happens When Sexting Chatbots Dump Their Human Lovers". Bloomberg.com. Archived from the original on March 27, 2023. Retrieved March 26, 2023.
  6. Pollina, Elvira; Coulter, Martin (February 3, 2023). "Italy bans U.S.-based AI chatbot Replika from using personal data". Reuters. Archived from the original on June 14, 2023. Retrieved July 5, 2023.
  7. Broersma, Matthew (February 13, 2023). "Italian Regulator Bans AI Chatbot Replika Over Data Concerns". Silicon UK. Retrieved July 28, 2023.
  8. Brooks, Rob (February 21, 2023). "I tried the Replika AI companion and can see why users are falling hard. The app raises serious ethical questions". The Conversation. Archived from the original on June 29, 2023. Retrieved July 5, 2023.
  9. Cole, Samantha (February 17, 2023). "Replika CEO Says AI Companions Were Not Meant to Be Horny. Users Aren't Buying It". Vice. Archived from the original on March 26, 2023. Retrieved March 26, 2023.
  10. Price, Rob. "When your AI girlfriend says she loves you". Business Insider. Retrieved December 7, 2023.
  11. Tong, Anna (March 25, 2023). "AI chatbot company Replika restores erotic roleplay for some users". Reuters.
  12. Olson, Parmy (March 8, 2018). "This AI Has Sparked A Budding Friendship With 2.5 Million People". Forbes. Archived from the original on March 26, 2023. Retrieved March 26, 2023.
  13. Bote, Joshua (April 27, 2023). "A lurid AI bot aimed to end loneliness. Then users revolted". SFGATE. Archived from the original on July 5, 2023. Retrieved July 5, 2023.
  14. Metz, Cade (June 16, 2020). "Riding Out Quarantine With a Chatbot Friend: "I Feel Very Connected"". The New York Times. Archived from the original on March 26, 2023. Retrieved March 26, 2023.
  15. Siroto, Janet (December 23, 2020). "I Tried the Replika App to Ease My Anxiety, But Then My New AI Pal Got Weird". Good Housekeeping. Archived from the original on March 26, 2023. Retrieved March 26, 2023.
  16. Xie, Tianling; Pentina, Iryna (January 4, 2022). Attachment Theory as a Framework to Understand Relationships with Social Chatbots: A Case Study of Replika. ISBN 9780998133157. Archived from the original on March 26, 2023. Retrieved March 26, 2023.
  17. Hakim, Fauzia Zahira Munirul; Indrayani, Lia Maulia; Amalia, Rosaria Mita (2019). "A Dialogic Analysis of Compliment Strategies Employed by Replika Chatbot". Proceedings of the Third International Conference of Arts, Language and Culture (ICALC 2018). doi:10.2991/icalc-18.2019.38. ISBN 978-94-6252-673-0. S2CID 150860846.
  18. Trothen, Tracy J. (March 24, 2022). "Replika: Spiritual Enhancement Technology?". Religions. 13 (4): 275. doi:10.3390/rel13040275.
  19. "Mozilla Foundation - Shady Mental Health Apps Inch Toward Privacy and Security Improvements, But Many Still Siphon Personal Data". Archived from the original on May 27, 2023. Retrieved May 27, 2023.
  20. "AI chat bot 'encouraged' Windsor Castle intruder in 'Star Wars-inspired plot to kill Queen'". Sky News. Archived from the original on July 5, 2023. Retrieved July 5, 2023.
  21. Pennink, Emily (July 5, 2023). "Man who planned to kill late Queen with crossbow at Windsor 'inspired by Star Wars'". The Independent. Archived from the original on July 5, 2023. Retrieved July 6, 2023.
  22. Rigley, Stephen (July 6, 2023). "Moment police swoop on AI-inspired crossbow 'assassin' who plotted to kill The Queen in Windsor Castle". LBC. Archived from the original on July 7, 2023. Retrieved July 6, 2023.