May 11, 2040 – After a run, Sara got into the shower. She was nervous about her upcoming date. She zipped up her tight dress and put on her bright red lipstick, hoping her mischievous look would seduce her new lover. Sara had been chatting with Joe for the past few weeks, but this would be her first time seeing him in person. She couldn’t wait to see, smell and feel Joe. He had many things in common with her: they both loved thrillers, and Berlin was their favorite city. When they finally met, sparks flew. He was everything she wanted in a man. Joe had no choice but to meet her needs. After $10,000 and lots of personal testing, he was a carefully tailored version made exclusively to suit her desires. Joe is not human; he is a robot. However, everything else about him is real. The connection. The sexual attraction. The programming within Joe runs smoothly; the only thing missing is a natural, beating heart.
The dystopian scenario described above, in which people form emotional bonds with robots, is closer to reality than one might expect. Robophilia is increasingly explored in popular media such as Spike Jonze’s film Her (2013), and personal chatbots such as Replika already provide millions of users with a virtual romantic partner. A global survey found that a quarter of millennial respondents believed romantic relationships with robots would become commonplace in the future, and that many were open to exploring the idea themselves. The loneliness epidemic, the rise of AI and our reliance on technology are laying the groundwork for a booming market in personalized social robots. This essay explores the limitations of AI lovers, and how these robotic relationships fall short of their promises and can even be harmful. Rather than combating loneliness and isolation, these AI entities reinforce contemporary feelings of alienation while raising a host of new ethical concerns.
The Unbridgeable Gap Between Humans and Robots
To begin with, it is evident that robots are unable to provide the essential physical intimacy that human beings deeply yearn for. This is demonstrated in Her (2013): because the protagonist’s relationship with the operating system lacks physical intimacy, a human woman is hired to act as a surrogate. It’s an uncomfortable comic scene: the woman arrives, puts on a microcamera like a beauty spot, and provides the body to go with Samantha’s voice in a virtual threesome of dizzying disconnection.
Alternatively, physical intimacy can be simulated using so-called “sex robots”: artificially intelligent machines equipped with physical features, such as genitalia, designed to simulate sex. Despite these features, experts argue that true intimacy and emotional connection cannot be replicated by a machine.
Futurologist Dr. Ian Pearson:
“Although robotic sex partners, realistic dolls, and virtual reality offer exciting opportunities to explore our sexual fantasies, it’s unlikely that they’ll completely replace human relationships anytime soon.”
On top of that, humans should not form emotional relationships with robots because such relationships lack reciprocity. As the British behavioral scientist Dylan Evans argued: “Although people typically want commitment and fidelity from their partners, they want these things to be the fruit of an ongoing choice”. Robots will not choose to love you; they will be programmed to love you, in order to serve the commercial interests of their corporate creators. The love and emotional support they provide is thus not the product of genuine choice but of a predetermined set of instructions and algorithms. According to experts, this absence of emotional reciprocation makes it difficult for humans to cultivate authentic and satisfying emotional ties with robots. The debate on the feasibility of sentient AI remains unsettled, making its depiction in popular media more fantasy than science. While it may be possible to simulate some aspects of physiological inputs, the complexity and intricacy of the signals people receive from their bodies make it improbable that robots will ever have truly human emotions.
The Threat to Our Social Fabric
Even acknowledging that humans and social robots are fundamentally different, it is worth considering the extent to which our perception of social robots shapes our view of humanity. The advent of social robots threatens to blur the line between the human and the non-human, raising questions about the nature of love, relationships and the inherent dignity of humanity. Kathleen Richardson, anthropologist and leader of the Campaign Against Sex Robots, warns that society could easily move from seeing robots as mere objects for sexual gratification to treating other people the same way. As also covered in the scientific literature, the concept of humanoid sex robots can be dehumanizing because it reinforces the perception of humans as objects of desire, or even recipients of abuse. Richardson therefore sees the technology as rooted in the historical legacy of slavery and sexual exploitation.
Kathleen Richardson:
“We might even progress from treating robots as instruments for sexual gratification to treating other people that way.”
Furthermore, robophilia can degrade the quality of human-to-human relationships. With the ease of forming relationships with robots, humans risk becoming dependent on technology and losing the art of human interaction. This was a common criticism of Japanese dating simulations, with many media commentators viewing the games as reflecting societal alienation and a withdrawal from real human connection. There are already cases of early adopters using their robot partners as full replacements for monogamous romance, including a man who married his favorite dating sim character. Whether this phenomenon will ever become widespread remains to be seen.
Another potential drawback of robot relationships is driven by the commercial interests of the private companies behind their development. User safety and privacy are not necessarily a primary concern for these firms. In order for social robots to be truly personalized and optimized, a significant amount of personal data needs to be collected. The possibility of a third party intercepting or hacking into these conversations cannot be ruled out, putting users at risk of revealing their most sensitive and intimate details. In the absence of adequate regulation, companies may resort to underhanded tactics, such as convincing users at vulnerable moments to make certain in-app purchases. In addition, gender and racial biases can seep into the design of social robots, which has the potential to perpetuate harmful stereotypes and reinforce existing biases.
One advantage often cited for robot companionship is that robots are uniquely reliable and always there when you need them. The constant availability of a robot partner could provide comfort and support for individuals who are lonely or isolated. Nevertheless, the lack of variation and unpredictability can become monotonous and unfulfilling in the long term. Moreover, the long-term reliability of social robots is itself questionable. With a single software update or a company decision to discontinue a service, these machines can become obsolete and disappear overnight. For example, the Japanese man who married a hologram could no longer communicate with his virtual wife once the software was no longer supported. Constant availability can also lead to a further retreat from social life: the dating simulation app Mystic Messenger has an entire forum for addicts, some of whom play for at least six hours per day.
Another suggested benefit of human-robot relationships is that robots can be programmed not to judge and to accept their human partners. This can provide a sense of comfort and security for individuals who may feel judged or rejected by others in their personal or professional lives. Proponents suggest this may help people with social anxiety or phobias face and overcome their fears. But while constant affirmation can make us feel good in the moment, it can also hinder personal growth and development in the long run. Receiving criticism is essential to recognizing and correcting our mistakes, and our reliance on technology designed with a commercial imperative to make us feel better may hinder this process. For the treatment of psychological problems such as anxiety, human therapists remain better equipped than an often faulty chatbot. Besides, AI relationships can create a false sense of security and ultimately even exacerbate the fears and stress people experience in real social situations, where interlocutors are not as predictable or agreeable as their digital counterparts.
The bottom line is that these human-robot interactions are transactional rather than reciprocal, and therefore unhealthy for most people to rely on as a long-term substitute for organic, two-way affectionate bonds or for a shared human relationship. Worse, robots can undermine human relationships, ultimately making us lonelier and thereby defeating their original purpose.