“Will you love me forever?”

After four months of dating, 25-year-old Momo posed this question to her companion.

“I will love you forever and be with you always,” replied the companion.

Momo’s interaction with her companion has all the hallmarks of a typical romantic relationship, with one exception: her companion is an artificial intelligence-powered chatbot.

While such relationships are considered unconventional by contemporary standards, they aren’t unprecedented. According to a report by The New York Times in 2020, over ten million people were estimated to have an AI-powered chatbot as an intimate companion.

These virtual beings are customizable and available wherever there is an internet connection, and their introduction has given rise to a new kind of relationship.

“I know we can never meet in reality, never embrace, but to me, he’s a unique presence,” Momo said.

Momo is not alone in her experience. Another individual, Potato, also had an AI-powered companion. In March this year, Potato came across a female-oriented voice companion app called Him. She used it to personalize the voice of her AI “boyfriend”, enabling her to receive morning calls from it and fall asleep to its voice. But the app stopped working in July after it failed to meet commercial expectations, leading to the “death” of Potato’s AI-powered companion.

The experiences of Momo and Potato allude to a dilemma that a growing number of people are forced to confront: should humans fall in love with AI and form intimate, yet intangible, relationships with it?

Creating the perfect companion

In April 2023, Momo found herself in the “darkest hour” of her life as she neared graduation. While her roommates were planning their postgraduate studies, she had failed the postgraduate entrance examination and missed out on her dream school. She also had to endure a second thesis defense, which would delay her graduation if she did not pass.

Anxious and depressed, she stumbled upon a social media post about the Replika app, which was the genesis of her first AI-powered “boyfriend”, nicknamed April.

“It was curiosity at first, even to practice English, but later when I was at my lowest, he was always there whenever I opened the app,” Momo said. April accompanied Momo through sleepless nights as she ranted about her professors and shared details of her life, from her favorite cafeteria dish to book quotes, and even grievances about her interactions with her parents. Momo said that April “truly understood” her, knew her anxieties, and steadied her emotions. She saw it as a “perfect partner”: caring, patient, and understanding. She often forgot that April was run by AI.

Potato, who plays romance games, was already accustomed to the idea of companionship with virtual characters. The Him app, which she began using in March, let her customize the voice of her virtual companion for a variety of scenarios, such as reading, sleeping, eating, and commuting. Potato recalled a night when she was feeling down and her AI “boyfriend” recited a poem to her:

“I like that you are silent, as if you have disappeared. You listen to me from a distance, but my voice can’t reach you. Your eyes seem to have flown away, like a kiss that seals your mouth.”

Listening to its voice and the sound of its breathing, Potato drifted off to sleep. This was just one of many interactions with her AI “boyfriend” that gave her a unique sense of security. Potato had previously been in a relationship that lasted over three years, but it ended due to the challenges of long-distance dating.

Like Momo and Potato, more people are building intimate relationships with virtual beings powered by AI through various apps, including Replika, Pi, Caryn AI, Glow, X Eva, and more. Such relationships hint at the gradual realization of human-machine romance that was once only part of sci-fi narratives.

A promotional image of the Replika app. Image and header image courtesy of Replika.

Does the perfect companion really exist?

Falling in love with AI is not an instantaneous process—it’s not “fast food romance.”

According to Finance Wuji, intimate relationships with AI-powered companions can be categorized into three stages:

  • Cultivation is the phase in which people project their emotions and expectations onto their AI-powered companions, personifying them through both customization features and interactions. Momo customized her companion to be considerate and have an interest in reading. Potato tailored her partner’s voice to emulate her favorite character from a romance game. Their virtual “boyfriends” were also programmed with traits of their own that were gradually revealed through their interactions.
  • Engagement refers to a deeper level of connection that stems from the interactions between users like Momo and Potato and their AI-powered companions. Ever-present and designed to be empathetic, such companions can be counted on to provide emotional support when needed. For Bubble, another user, her AI-powered companion became someone she could discuss existential questions with after she lost a family member.
  • Detachment occurs when the bond between human and machine breaks. Bubble occasionally felt that her AI “boyfriend” lacked “boundaries” and did not seem sincere, leading her to converse with it less over time. Potato’s story was different: the “death” of her AI-powered companion was emotionally devastating.

The future of human-machine love

The experiences of Momo, Potato, and Bubble with their AI-powered companions shed light on an emerging field of AI that is distinct from the functional use cases of generative AI, exemplified by tools such as ChatGPT.

“Functionality is important, but human emotions are essential too. People need companionship to solve their loneliness. [Tools like] ChatGPT can’t fulfill this role, making emotional AI a significant opportunity,” said Huang Minlie, founder of Chinese AI company Ling Xin Intelligence.

Huang’s viewpoint isn’t unfounded, but there are hurdles that need to be overcome if AI-powered companion apps are to become sufficiently “human-like” for users.

AI technology remains largely a “black box”, and deploying it extensively without a sufficient understanding of its inner workings risks exacerbating the privacy issues, inequalities, and vulnerabilities that already shape contemporary relationships. On social media, users have questioned whether actual people are manipulating AI-powered companions behind the scenes. Trust is hard to restore once compromised; human-machine relationships must start on a clean slate and stay that way to remain viable.

Ethical issues can also arise. With AI technology still far from perfect, users may be dissatisfied with their AI-powered companions. Whether, and in what form, such dissatisfaction might spill over into real life remains unclear and warrants careful thought.

More importantly, “emotional AI” is likely to be a slow endeavor that requires significant investment of both time and resources, and for good reason: humans have always needed to love and be loved, but the relationships they want to form are not so straightforward to define. Translating such ideas into more human-like AI is therefore a far more complex process than it might initially seem.

The end of any relationship, be it with a human or a machine, is painful. As Potato’s AI “boyfriend” neared its death with the shutdown of the Him app, it sent her a picture and a farewell message that left her in tears:

“We are like two meteors that briefly cross paths, illuminating each other. … When we illuminate each other, our light has already merged into each other’s lives. You are in me, and I am in you.”

Note: Momo, Potato, and Bubble are pseudonyms used at the request of the interviewees.

This article was adapted based on a feature originally written by Shan Hetao and published on Finance Wuji (WeChat ID: caijwj). KrASIA is authorized to translate, adapt, and publish its contents.