Will machines ever gauge our emotions?
Cognitive neuroscientist Katri Saarikivi examines emotional intelligence and what it means in the digital age.
Be honest. Do you ever swear at your computer when it does not seem to understand what you want? Talk back to your GPS navigator when it gives you directions?
Most people answer “yes” to these questions. It seems that we express emotions to our digital tools on a regular basis. There’s just one problem. They do not care.
Because they can’t!
Our tools are becoming smarter every year, surpassing human competence in many areas. However, they still lack one realm of human capacity: emotional intelligence.
This is a problem, because according to current scientific understanding, emotions are on all the time, present in all kinds of human thinking, decision-making, and action.
For instance, well-known studies of cognitive biases showed that when German music was playing in a wine shop, customers bought more German wine; when French music was playing, they bought more French wine. And they did so unwittingly.
Therefore, if we truly want to understand each other as well as our own actions, we need emotional intelligence. And if the machines we interact with do not understand our emotions, they are lacking highly salient information about the user.
The most obvious problems resulting from this emotional ignorance of machines occur in computer-mediated interaction. When we interact face-to-face, things like body language, facial expressions, and tone of voice all provide a wealth of information about our emotions. These high-speed signals intertwine with the literal content of the message, providing others with insight into whether we are serious, joking, sarcastic, moved, or indifferent.
A significant amount of this information is lacking when interaction is mediated digitally. Instead of dynamically changing gestures and expressions, we must make do with GIFs and emojis.
This makes it harder to understand what others actually mean. Online discussions easily become unnecessarily heated, cyberbullying is a growing concern, and virtual work teams are rarely as effective as ones that interact face-to-face.
What to do?
The field of affective computing is trying to fix this by teaching machines to read and respond to human emotions. For example, machine vision algorithms are becoming better at reading our facial expressions, and sensor technology is allowing for real-time measurement of physiological reactions related to emotions.
Based on these advances, we can train chatbots to seem empathic, robots to gesture emotionally, and slowly broaden the emotional bandwidth of the internet. Perhaps one day your computer will finally understand your tone of voice when you say NOT NOW to the update that starts without asking just as you’re in a hurry to log off.
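To make the idea concrete, here is a deliberately toy sketch of the simplest form of emotion detection: scanning a text message for emotion-laden keywords. Everything in it (the keyword lists, the labels) is invented for illustration; real affective-computing systems rely on models trained on facial expressions, voice, and physiological signals rather than word lists.

```python
# Toy emotion detector: counts emotion-laden keywords in a message.
# Keyword sets are invented for this sketch, not from any real system.

EMOTION_KEYWORDS = {
    "anger": {"hate", "stupid", "furious", "annoying"},
    "joy": {"love", "great", "thanks", "wonderful"},
    "sadness": {"sorry", "miss", "sad", "unfortunately"},
}

def detect_emotion(message: str) -> str:
    """Return the emotion whose keywords appear most often, or 'neutral'."""
    words = [w.strip(".,!?") for w in message.lower().split()]
    scores = {
        emotion: sum(word in keywords for word in words)
        for emotion, keywords in EMOTION_KEYWORDS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(detect_emotion("I hate this stupid update!"))  # anger
print(detect_emotion("See you at noon."))            # neutral
```

Even this crude approach hints at why the problem is hard: sarcasm, tone, and context carry most of the emotional signal, and none of them survive a keyword count.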
However, for the time being, it's best to keep these shortcomings of our intelligent machines in mind and rely on the one cognitive capability at which machines still have a hard time beating us: empathy.
Main photo Andy Kelly