The nuances that occur between people are complex, and it is these nuances that separate humans from machines. This raises the question: will emotionally intelligent machines ever truly exist? Neil Hammerton investigates
Emotional intelligence is defined as the ability to recognise, understand and manage our own emotions, and to recognise, understand and influence the emotions of others. It’s a key quality we expect from anyone we speak to about a problem, not least customer service agents.
Being able to understand another’s emotions is undeniably a vital and defining characteristic for anyone looking to be successful in a customer-facing role.
But as we enter a new digital era, artificial intelligence (AI)-supported technology such as chatbots is automating ever more of the customer experience. And more is on its way, from AI that can distinguish good calls from bad, to machine learning that can translate phone conversations in real time.
And since customer experience and emotional intelligence are seemingly inextricable, any AI technology designed for the customer service industry must surely display emotional intelligence to the same standard as a successful human operative. But have we reached that point yet?
A long way to go
While machines are getting better at understanding human emotional context clues – and are improving all the time – there are still plenty of limitations, particularly in the mass market. This is supported by a recent study of artificial emotional intelligence published in the International Research Journal of Advanced Engineering and Science, which states that current research and development on AI-based systems does not have enough data on the emotional aspects of human intelligence.
We can see this borne out in human responses to sentiment analytics tools, which one in five respondents to a recent Natterbox survey use as part of their role. These tools are undoubtedly valuable in helping us understand the prevailing sentiment and trends within large pools of data – for example, conversations around a brand on social media, or the emails received in response to a specific offer.
But isolated, more nuanced statements can get lost in the noise of sentiment analytics. To give one example: when even many human internet users misunderstand irony in text, how can we expect a programme to spot it?
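The irony problem is easy to demonstrate with a toy model. The sketch below is an illustrative assumption, not any vendor's actual implementation: it scores text by counting words from small positive and negative lexicons, which is the simplest form of sentiment analysis. An ironic complaint built from positive surface words scores as positive, even though its intent is plainly negative.

```python
# A minimal sketch of why lexicon-based sentiment scoring misses irony.
# The word lists and scoring rule are illustrative assumptions, not a
# real product's lexicon.

POSITIVE = {"great", "love", "wonderful", "excellent", "perfect"}
NEGATIVE = {"terrible", "hate", "awful", "broken", "delayed"}

def naive_sentiment(text: str) -> str:
    """Label text by counting positive vs negative lexicon words."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

# A literal complaint is caught, because it uses lexicon words directly.
print(naive_sentiment("This delayed delivery is terrible."))   # negative

# An ironic complaint is not: the surface words are positive.
print(naive_sentiment("Oh great, just perfect. My order is late again!"))  # positive
```

The model has no notion of context, so the sarcastic "great, just perfect" reads as praise. Production tools are far more sophisticated, but the underlying gap – surface words versus intended meaning – is the same one the article describes.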