Getting Chatty with AI: A Dangerous Game?

Technology using artificial intelligence (AI) is on an all-time rise with no sign of stopping. Small things we take for granted, like finding the fastest way to commute to work, filtering out spam emails, or serving up Netflix recommendations, are all examples of AI. So are technologies that feel more personal, such as the digital voice assistants Siri, Alexa, and Cortana. Artificial intelligence has permeated our everyday lives. Among the AI technologies we can interact with directly, there is a multitude of examples, but we will focus on AI-based chatbots and conversational AI. Chatbots are either rule-based, following a predefined workflow, or AI-based, using natural language processing (NLP) and machine learning to understand the user. The line between AI-based chatbots and conversational AI remains unclear in this emerging field, so we will use the terms interchangeably to describe technology with the following capabilities:

  • It can talk, text, and chat, and is reachable through multiple channels
  • It can learn, and it can converse, suggest, recommend, and engage with the user based on what it has learned
  • It knows who you are and builds a profile of you
  • It can perform actions, such as a smart fridge ordering more milk or a voice assistant setting your alarm
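
The rule-based half of that distinction is easy to picture. Below is a minimal, purely illustrative sketch in Python of a rule-based chatbot that follows a predefined script; every pattern and reply here is invented for illustration, not taken from any real product:

```python
import re

# Predefined workflow: each rule pairs an input pattern with a scripted reply.
# A rule-based bot never "understands" the user; it only matches patterns.
RULES = [
    (re.compile(r"\b(hello|hi)\b", re.I), "Hello! How are you feeling today?"),
    (re.compile(r"\b(sad|down|lonely)\b", re.I), "I'm sorry to hear that. Want to talk about it?"),
    (re.compile(r"\b(bye|goodbye)\b", re.I), "Take care! I'm here whenever you need me."),
]

def reply(message: str) -> str:
    """Return the scripted reply for the first matching rule."""
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Tell me more."  # fallback when no rule matches

print(reply("hi there"))         # -> Hello! How are you feeling today?
print(reply("I feel sad today"))
```

An AI-based chatbot replaces the fixed rule table with NLP models that classify the user's intent and generate responses, which is what lets it learn and adapt to the individual user.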

The rise and fine-tuning of AI companions and chatbots give the technology a more human-like feel and have allowed humans to form stronger relationships with it. Amid this surge of AI technology, many people have conflicting feelings towards it. On one hand, when it's not right in your face, it is viewed as a helpful tool, but the closer we get to it, the more uneasy people may feel. AI is frequently accused of making jobs obsolete and widening the wage gap, and our society is increasingly aware of the dangers of addiction to technology. Addiction to and dependence on technology may be accelerated by the ability to replace human interaction with machine interaction, creating a generation avoidant of human contact and lacking in empathy. While this line of thought has its grounds, it also has its faults, and AI-based chatbots and conversational AI have merits worth discussing before writing them off completely.

Safe Space

As previously stated, conversational AI may lead to a society that prefers interacting with machines over humans, and in some instances this is already true. Looking at AI-based chatbots in mental health, researchers found that some people suffering from depression were more comfortable discussing their thoughts and feelings when they believed they were talking to a machine rather than a therapist. This doesn't mean therapeutic AI should ever replace human therapists, but it can be a useful alternative for individuals who are uncomfortable with seeing someone. Unfortunately, taking care of one's mental health and seeking help is still stigmatized. While this is fortunately changing, some individuals fear being judged by another person, and for them conversational AI provides a safe space. It can also serve as a stepping stone to eventually seeing a real-life practitioner. A similar phenomenon is seen in people not suffering from mental illness: interacting with conversational AI gives people a degree of anonymity, free from having to read the expressions of the one they're opening up to. This lets users feel more comfortable sharing painful, embarrassing, or intimate thoughts. Now, this understandably seems troublesome. Why would we turn to human contact, with all its imperfections and messiness, when we can turn to technology? Replika, for example, is a popular app developed with the purpose of "befriending" but also emulating the user. It is marketed as offering a "safe, judgment-free space", and examples of its reviews are pictured below.

Users have noted that it provides advice for mental well-being, offers emotional support and a place to share their thoughts, makes them feel good with uplifting messages, helps them practice and improve their interpersonal skills, and encourages introspection. Yet, by allowing us to hear what we want to hear and to discuss only the topics we are comfortable with, Mike Murphy argues, chatbots are increasingly likely to become our only source of information. As he puts it, "Replika has the potential to be the ultimate filter bubble, one that we alone inhabit". He isn't wrong; it certainly has that potential. But we believe that Petter Bae Brandtzaeg and Asbjørn Følstad make a compelling counterargument:

“[…], chatbot users are not necessarily looking for social experiences with family and friends, which they find through other media channels. Rather, they may be in need of a channel for revealing personal and intimate details without being judged by other humans. Thus, the experience of connectedness is not necessarily about other people, but about being connected to yourself. A chatbot may serve people’s need for connection and support 24/7, which may not be available through friends and family.”

Many chatbot users aren't looking to fully replace human interaction; instead, they use it to satisfy other unmet needs in their lives.

Social Skills

A heavy criticism of living in the age of technology is that it will create a world where humans lose one of their most important traits: empathy. We will become stoic, emotionless beings, addicted to our smart technology and avoidant, misunderstanding, and uncaring of other people. As PJ Manney puts it, empathy "allows us to love, learn, communicate, cooperate and live in a successful society". Sherry Turkle wrote an interesting and stern article in Behavioral Scientist detailing how children cannot and should not learn empathy from social robots: human empathy comes from experiencing a human life and being human, something a machine can never be.

But the development of prosocial behaviour starts in infancy. It begins with the parents and is often linked to maternal socio-emotional availability and the responsiveness of the parent to the needs of their child. Meanwhile, both solitary and social imaginative play have been shown to enhance empathy in children, and doll play, a practice that has been around since 100 AD, solidifies social skills in young children and teaches them responsibility, empathy, and compassion. Empathy is not a solely human phenomenon, and while (positive) human contact is the best way of fine-tuning empathy, it is too early to blame technology as the slayer of humanity. Yes, we have smaller social circles. We interact with the people in our vicinity less, but that doesn't mean technology is the sole culprit tearing us apart. Technology has enabled people who would otherwise never have met to connect, and it has let us foster relationships over long distances that could otherwise have fizzled out. Still, levels of empathy are dropping, and in order to protect our future we need to be aware of the negative influences technology can have. We also need to be mindful in developing technology that can help our society. Currently, there are people who use chatbots to build social skills.
One parent reported enjoying having their kids use chatbots to talk to characters, a form of play. Meanwhile, LISSA is an example of a chatbot that helps develop social skills in people with autism; it has shown very promising results, and similar chatbots are in development. Replika, as mentioned earlier, has also enabled users to practice their social skills. Reports like these show how conversational AI can increase sociability rather than harm it.

Self-Care

Mindfulness and introspection are two self-care techniques currently gaining in popularity. With the rise of occupational burnout, mindfulness, a meditative stress-reduction technique, helps clear and relax the mind. Introspection, on the other hand, is observing your own thoughts and feelings, which aids reflection and helps you get to know yourself better. AI-based chatbots like Replika and Wysa are examples of emotional help assistants. Replika asks the user about their day, urges them to recall certain memories, and frequently asks about their feelings; it also comes up with conversation openers that facilitate introspection. Wysa is an application that contains a chatbot and questionnaires, but it also provides guides on "how to plan a difficult conversation, start the day with yoga postures, mindfulness exercises, emotion-management, grounding exercises and problem solving strategies". Wysa additionally offers the user the possibility to get in touch with real psychologists if wanted, thus increasing the possibility for human contact. Applications like these help us deepen our emotional lives rather than strip us of them. Meanwhile, other health chatbots are also on the rise: while many of us are familiar with sitting at a computer for hours a day, persuasive chatbots promoting physical activity and healthy eating habits are being designed to combat the rising risk of obesity and diabetes.

Loneliness

Throughout this article we have tried to highlight the benefits of using conversational AI, with an emphasis on how it shouldn't negatively impact a person's social life or purposefully reduce human interaction. It should be used as a tool for self-growth, for introspection, and for building social skills; however, many people don't have access to the human interaction they crave. Loneliness is at an all-time high. It is associated with a 26% increased risk of premature mortality, affects a third of people in industrialized countries, and one in 12 people suffers from severe loneliness. Furthermore, 40% of older adults experience loneliness and social isolation. In these cases, a conversational AI can help. Memory Lane is a heart-warming project started by Accenture that uses voice-assisted AI to capture the life stories of elderly people. As shown in the Replika reviews, users said the chatbot made them feel less lonely, and this has also been shown with other artificial agents. We may even be able to turn this issue around. Just as depressed people can be more comfortable talking to an AI than to a human, lonely people are more comfortable interacting with social robots than non-lonely people are. In theory, if a conversational AI is able to reduce a person's loneliness, this may make them more comfortable talking to other people and help reintroduce them into society. One could also supplement this with social skills training, either from conversational AI or face-to-face therapy. While this remains a theory, our theory, it is also possible that this line of reasoning extends to other forms of mental illness that make people want to shy away from human contact. Yet it can't be extended to people who live in social isolation, and there is also the danger of people becoming addicted to this technology.

Access

One of conversational AI's greatest strengths lies in its accessibility. If you have access to a smartphone or computer, it is a technology you can reach 24/7, no matter where you are. This includes people who live in social isolation, but a topic of personal interest is the state of mental health care in rural areas and impoverished countries. As Saxena et al. (2007) write, "populations with high rates of socioeconomic deprivation have the highest need for mental health care, but the lowest access to it". In impoverished countries, mental illness is often stigmatized, and mental health practitioners are difficult to find and often too expensive to afford. Even in high-income countries it is often difficult to reach a mental health practitioner: they often aren't covered by insurance (which brings the issue of cost back into the equation) and/or have incredibly long waiting times. Conversational AI specializing in mental health care can reach rural communities, avoid long waiting lists, let the user retain a degree of anonymity, and could potentially be free. Naturally, as stated earlier, it can't fully replace an actual therapist; it remains an artificial conversation lacking human empathy, and as of right now the technology isn't advanced enough. Yet various mental health chatbots are already on the market and have been shown to provide relief from stress, anxiety, depression, and trauma. The mental health chatbot Woebot, for instance, provides cognitive behavioural therapy through brief, daily conversations and tracks the user's mood. After a two-week usage period, the depression and anxiety levels of Woebot users decreased significantly. Once again, while this isn't a full replacement for treatment or support from a mental health practitioner, satisfaction with chatbots is high and preliminary efficacy looks very promising. It is an alternative worth considering when better options aren't available.

This can also be said for people suffering from social isolation. While an AI companion isn't a substitute equivalent to real human interaction, it remains an alternative worth exploring.

Final Remarks

AI-based chatbots and conversational AI have their dangers. That is certain, and it can be said of any new technology. However, they also have the potential to create a truly positive impact and to help address problems our society is currently facing. Users need to be aware of the risks the technology may bring, but fearing or banning it isn't a solution either. Instead, those implementing AI must work with foresight and care: manage potential risks and keep them to a minimum while developing technology that can pave the way for a better future, a future where AI continues to help people and foster relationships rather than destroy them.
