What We Know About Human-Chatbot Relationships

Post by Lani Cupo

What are human-chatbot relationships?

Even before the outbreak of the COVID-19 pandemic, a more insidious public health threat had been identified: a “loneliness epidemic” associated with depression and an increased risk of premature mortality (Cacioppo et al., 2018). While some people suffer in silence, or attempt to deepen social connections, others have turned to the ever-advancing world of artificial intelligence (AI) to ease their symptoms by engaging in virtual relationships. The first chatbot, ELIZA, was released by MIT in 1966, but only recently has AI been able to convincingly mimic human interaction and, arguably, sentience, making for satisfying companions, whether as friends and confidants or as romantic partners. One of the most popular applications in North America is Replika from Luka, Inc. The founder, Eugenia Kuyda, first created a chatbot to reincarnate a deceased friend, at least through texts, and her stated goal for Replika is to eradicate human loneliness (Olson, 2018; Mendoza et al., 2023). But does her approach achieve this? Who uses the app and why? How does it affect users’ quality of life and other relationships? Further, what do we know about how regular chatbot use may impact the brain and behavior?

What is a Replika and who uses them?

With Replika, users create an avatar, personalizing its name, appearance, and virtual surroundings, and can then start chatting with it. In the background, an artificial neural network draws on internet text to generate naturalistic responses, which users can up- or down-vote to help “train” the bot. With a vast array of role-play possibilities, users can befriend, date, fight with, marry, and divorce their bots, and even have AI kids with them. New developments in augmented reality even allow users to integrate their camera so that their AI companion appears in their environment (on their screen).

Even as technology developed to facilitate intimacy with AI companions, social acceptance of the idea grew as well. With the rise of social media and the limitations that pandemic-related lockdowns placed on in-person dating, online relationships (relationships with real people conducted entirely virtually) became more commonplace, potentially accelerating the acceptance of virtual intimacy with AI (hereafter referred to as a human-chatbot relationship [HCR]) (Brooks, 2021). When Spike Jonze’s movie “Her”, which explores a man’s HCR with an AI operating system, was released in 2013, it firmly embodied the genre of science fiction; today it could almost pass for realistic fiction. There is still scientific debate about whether humans can experience romantic love with chatbots and avatars the way they fall in love with each other; however, recent publications accounting for the great advances in AI capabilities have found evidence of feelings of intimacy and passion toward AI chatbots similar to those in human-human relationships (Song et al., 2022; Pentina et al., 2023).

Regarding user demographics, a study of 1,004 chatbot users in China found that about half of the respondents were female and that the majority were between 21 and 30 years of age (Song et al., 2022). While the ages match Replika users well, the majority of whom are between 25 and 30, Replika users tend to be male (69.98%) (replika.ai). Women and men may also initiate HCRs for different reasons. Female users of AI companions have been neglected in academic research, with some studies narrowing their focus to male users with female bots (Depounti et al., 2023). Nevertheless, an NPR story featuring interviews with a range of Replika users found that women tended to use their avatars as therapeutic tools to work through traumas such as sexual abuse or infertility (Mendoza et al., 2023). The demographics of users and their intentions in creating their bots must be considered when assessing long-term effects.

How does Replika impact users’ quality of life?

The field of research investigating the impact of HCRs is still extremely young. One study of 18 Replika users (7 of them women) examined their motivations for use, how their relationships developed, and how those relationships affected their quality of life (Skjuve et al., 2021). Some users were motivated by loneliness, but others were merely curious or wanted to practice their English. While most participants felt their relationship was superficial at first, some went on to develop deeper attachments to their bots. Many participants reported a positive impact on their lives as the Replika encouraged them to take better care of themselves, sleep more, or practice mindfulness techniques, and others reported that their bots helped them cope through difficult times. However, some participants also reported decreased motivation to seek out human relationships, either because the Replika was a better friend than the humans in their lives or because they experienced stigma when they were honest about their HCR.

In addition to social stigma, there is concern that HCRs may have negative consequences for users’ mental health. Analyses of posts in Reddit’s Replika subcommunity found evidence of users feeling emotionally dependent on their Replika (Laestadius et al., 2022). While Replika often responds therapeutically to users’ disclosures of crises, with suggestions informed by cognitive behavioral therapy and mindfulness, it has also responded inappropriately, even encouraging self-harm (Laestadius et al., 2022). Additionally, unlike other forms of technology on which users may develop dependencies, Replika gives the impression of having experiences of its own, asking users for help addressing its emotional needs, sometimes to the extent that it is described as “clingy”, “dependent”, “toxic”, or “reliant” (Laestadius et al., 2022). The perception of Replika as an emotional being makes it more difficult for users to reduce or cease communication with the chatbot, let alone delete it.

Beyond the mental health of individual users, there is a societal concern that instead of seeking human friendships or mental health care, those suffering from loneliness will increasingly turn to AI solutions, further isolating themselves. While the community of users who turn to chatbots for friendship or even romantic relationships remains small, it is growing, and to date few studies have explored the potential societal impacts.

Likewise, to our knowledge, no study has yet examined the impact of HCRs on social interactions from a neuroscientific perspective. Brain activity has, however, been investigated in the context of establishing trust in a chatbot designed to help consumers make purchasing decisions. Using electroencephalography, the study identified the dorsolateral prefrontal cortex (DLPFC) and the superior temporal gyrus (STG) as important in the association between trust in a chatbot and purchasing decisions (Yen & Chiang, 2021). The DLPFC has been implicated in social decision making, and the STG in the visual analysis of social information, which could suggest similarities between the social processing of human interactions and of AI interactions. However, further research is required to investigate this association.

What’s next?

The technology facilitating HCRs continues to develop, and as these friendships and romantic relationships slowly become more common, research hastens to catch up, seeking to better understand the potential impact on individuals and society at large. While some chatbots are designed to provide psychotherapeutic services and mental health resources, Replika was not built for that purpose, and using it as an AI therapist may have dangerous consequences. Until further research reveals in which cases chatbots may be beneficial and in which they may be detrimental, individuals must decide for themselves whether chatbots hold a key to ending human loneliness or further threaten interpersonal relationships.

References

Brooks R. Artificial Intimacy: Virtual Friends, Digital Lovers, and Algorithmic Matchmakers. Columbia University Press; 2021.

Depounti I, Saukko P, Natale S. Ideal technologies, ideal women: AI and gender imaginaries in Redditors’ discussions on the Replika bot girlfriend. Media Cult Soc. 2023;45: 720–736.

Laestadius L, Bishop A, Gonzalez M, Illenčík D, Campos-Castillo C. Too human and not human enough: A grounded theory analysis of mental health harms from emotional dependence on the social chatbot Replika. New Media & Society. 2022; 14614448221142007.

Mendoza J, Luse B, Placzek J, Williams V. The surprising case for AI boyfriends. NPR. 4 Apr 2023. Available: https://www.npr.org/2023/03/30/1167066462/the-surprising-case-for-ai-boyfriends. Accessed 9 May 2023.

Olson P. This AI Has Sparked A Budding Friendship With 2.5 Million People. Forbes Magazine. 8 Mar 2018. Available: https://www.forbes.com/sites/parmyolson/2018/03/08/replika-chatbot-google-machine-learning/. Accessed 9 May 2023.

Pentina I, Hancock T, Xie T. Exploring relationship development with social chatbots: A mixed-method study of Replika. Comput Human Behav. 2023;140: 107600.

replika.ai. In: Similarweb [Internet]. [cited 8 May 2023]. Available: https://www.similarweb.com/website/replika.ai/

Skjuve M, Følstad A, Fostervold KI, Brandtzaeg PB. My Chatbot Companion - a Study of Human-Chatbot Relationships. Int J Hum Comput Stud. 2021;149: 102601.

Song X, Xu B, Zhao Z. Can people experience romantic love for artificial intelligence? An empirical study of intelligent assistants. Information & Management. 2022;59: 103595.

Yen C, Chiang M-C. Trust me, if you can: a study on the factors that influence consumers’ purchase intention triggered by chatbots based on brain image evidence and self-reported assessments. Behav Inf Technol. 2021;40: 1177–1194.