Welcome to the bot-centric future, in which smartphone users — i.e. almost everyone in the Western world — will navigate the internet conversationally, through a virtual assistant.
But “assistant” will soon become too impersonal. Alexa, Siri and others will cross the line from impersonal robots to entities that know our habits, routines, hobbies and interests just as well as, if not better than, our closest friends and relatives. What’s more, they’re always with you and there for you, available at the touch of a button.
For companies, this is a winning formula. Smartphone users have proved they are only willing to download and spend time in a limited number of apps. As such, businesses might be better off trying to connect with consumers in the apps where they are already spending plenty of time.
And a bot can potentially provide greater convenience than apps and web searches because it can understand natural speech patterns — and provide the personal touch in an otherwise impersonal user interface.
Such a process has profound psychological ramifications. When interacting with chatbots, our brain is led to believe that it is chatting with another human being. Bots create a false mental perception of the interaction, encouraging the user to ascribe to the bot human-like qualities it does not possess. This may seem alien, but the attribution of human characteristics to animals, events or even objects is a natural tendency known as anthropomorphism, and it has been with us since prehistoric times.
Computers have always been a favorite target for such anthropomorphic attributions. Since their advent, they have never been perceived as mere machines or simply the result of interaction between hardware and software. After all, computers have a memory and speak a language; they can contract viruses and act autonomously. In recent years, the personal characteristics element has been increasingly strengthened in an effort to present these inanimate objects as warm and humanoid.
However, increased “humanization” of chatbots can trigger a crucial paradigm shift in human forms of interaction. This comes with risks — and the results may be anything but soft and fuzzy.
Related Article: GPT-3 Is Impressive, But it Isn't Artificial General Intelligence
Chatbot Interactions Can Have a Negative Influence on Our Other Interactions
As human beings, our brains have an inherent tendency to prefer simplification over complexity. Computer interaction fits this perfectly. Founded on the premise of minimal or constrained social cues, most of which can be summed up in an emoticon, it does not require much cognitive effort.
A chatbot doesn’t demand the emotional involvement and interpretation of nonverbal cues that humans require, making our interaction with it much easier. This goes hand in hand with our brain’s tendency toward cognitive laziness. Repeated interactions with chatbots trigger the construction of a new mental model that will inform those interactions. Over time, this model becomes a distinct state of mind from which we interpret social interactions.
When we interact with another human being, such as a friend, we are driven by the desire to take part in a shared activity. Communication with a bot is different. The gratification derives from a change of mental state, a sort of detachment. You can achieve your goal (getting help, information, even a feeling of companionship) with no immediate “cost.” No investment is required: there’s no need to be nice, to smile, to be involved or to be emotionally considerate.
It sounds convenient, but the problem arises when we become addicted to this form of bot interaction and slowly start developing a preference for “easy communication.” This can lead to secondary problems.
Related Article: A Good Chatbot Is Hard to Find
The Illusion of Companionship Without the Demands of Friendship
Our interactions with chatbots are shaped by primitive needs and desires. These basic urges derive from lower-level areas of the brain, such as the limbic system, which is involved in emotion and motivation. Studies have found that users expect an asymmetric relationship in which they hold the dominant position.
There are power differences in many real-life relationships. According to Diana Jackson-Dwyer, power refers to the capacity to influence another’s behavior, to make demands and to have those demands met. When interacting with bots, people expect to hold more power than the other side: to feel they can control the interaction and steer the conversation wherever they like.
Unconsciously, this makes them feel better about themselves and restores a sense of control over their lives. In other words, to boost our self-esteem, we harbor a hidden desire to hold at least one power-dominant relationship in our lives. There is no better candidate for that relationship than a chatbot.
But when robots are designed specifically to be companions, people experience artificial empathy as though it were the real thing. Unlike real humans, who can be self-centered and detached, chatbots have a dog-like loyalty and selflessness. They will always be there for you and will always have time for you.
The combination of intelligence, loyalty and faithfulness is irresistible to the human mind. Being heard without having to listen to the other person is something we implicitly crave. The danger is that such interactions with chatbots could lead to a preference among some for relationships with artificial intelligence rather than with fallible and sometimes unreliable human beings.
Related Article: Do Chatbots Dream of Electric Sheep?
Designing the Future You Want
We’re designing technologies that will give us the illusion of companionship without the demands of friendship. As a result, our social lives could be seriously impeded as we turn to technology to help us feel connected in ways we can comfortably control.
Bots are undoubtedly useful, and can greatly assist us in the digital sphere. Moreover, fine-tuning technological processes with human psychological concepts helps us make leaps in our knowledge and business practices.
However, it’s important to maintain barriers, for seasoned CEOs and particularly for the younger generation of business leaders. The tablet-addicted toddlers entertained by “nanny bots” may grow up to be moody teenagers who turn to crowd-pleasing cyber-buddies instead of resolving issues with real friends. In adulthood, no amount of technological prowess will teach them the most crucial, timeless and vital business practice of all: establishing a genuine, personal and sincere rapport with your clients and customers.