The next big feature for Unified Communications: real-time interpretation of emotion through voice

A company called eXaudios Technologies has developed a new technology that interprets a person’s emotional state in real time, based on how they speak. You can view their six-minute demonstration here.

This technology has been developed for the Unified Communications market and in particular for the call centre, but it can also be applied to so many other aspects of human interaction.

From the moment I saw the demonstration, I started to think about the evolution of telecommunications and how technology offers new methods through which people can communicate more effectively.

In Unified Communications, ‘Presence’ is the feature that lets people publish their own availability and view others’, along with the context of that availability: for example available, busy, away, in a meeting, on sick leave, on holiday, offline, on mobile and so on. Presence helps us communicate more effectively because it allows us to modify our choices of how and when we communicate. For example, if you are in a meeting, people who want to reach you will normally choose to wait for your presence indicator to change to available, because they know that you most likely won’t be reading or replying to emails, and you won’t be taking calls either.
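To make the mechanics concrete, here is a minimal sketch in Python (not any vendor’s actual API; the status values and the `choose_contact_method` rule are hypothetical) of how a published presence state can drive the choice of how and when to make contact:

```python
from enum import Enum

class Presence(Enum):
    AVAILABLE = "available"
    BUSY = "busy"
    IN_A_MEETING = "in a meeting"
    AWAY = "away"
    ON_MOBILE = "on mobile"
    OFFLINE = "offline"

def choose_contact_method(presence: Presence, urgent: bool = False) -> str:
    """Decide how (or whether) to reach someone from their published presence.

    The rule here is hypothetical: interrupt only when they are available,
    otherwise fall back to asynchronous channels.
    """
    if presence is Presence.AVAILABLE:
        return "call now"
    if presence is Presence.ON_MOBILE and urgent:
        return "send a short text message"
    if presence in (Presence.BUSY, Presence.IN_A_MEETING):
        return "wait for the indicator to change to available"
    return "send an email and wait"

print(choose_contact_method(Presence.IN_A_MEETING))
# -> wait for the indicator to change to available
```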

This change in behaviour reduces your stress, because you are not dealing with your phone vibrating or ringing throughout your meeting, and it also reduces the stress of the people trying to contact you. Psychologically speaking, a sense of certainty makes the human mind less stressed. So it makes sense that Presence can reduce our stress in a very subtle way, by creating certainty around people’s availability and their preferred method of contact.

In addition to having context around someone’s availability, this new technology from eXaudios Technologies can now give us a sense of people’s emotional state. People may perceive this as a positive or a negative, just as some do with Presence. In my opinion, any tool that helps us prioritise our efforts better (saving us from wasting time), or helps us be more considerate of others (reducing how much we contribute to their stress load), is a tool with positive applications.
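Purely as an illustration (eXaudios has not published an API, so the `ContactState` fields, emotion labels and weights below are all invented for the example), an emotional-state reading could sit alongside presence and feed a simple prioritisation rule:

```python
from dataclasses import dataclass

@dataclass
class ContactState:
    name: str
    presence: str        # e.g. "available", "busy"
    emotion: str         # e.g. "calm", "frustrated" -- hypothetical labels
    confidence: float    # classifier confidence in the emotion label, 0..1

def priority(state: ContactState) -> float:
    """Hypothetical scoring: respond first to people who sound distressed.

    The weights are invented for illustration; a real deployment would
    tune them, and should treat low-confidence readings with caution.
    """
    distress = {"frustrated": 1.0, "stressed": 0.8, "calm": 0.1}.get(state.emotion, 0.3)
    return distress * state.confidence

queue = [
    ContactState("Alice", "available", "calm", 0.9),
    ContactState("Bob", "available", "frustrated", 0.7),
]
queue.sort(key=priority, reverse=True)
print([c.name for c in queue])  # -> ['Bob', 'Alice']
```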

If you watch the video demonstration, you can see how useful this technology could be in a call centre environment: it gives agents another tool for understanding the person they are speaking with. Ultimately, understanding and emotional intelligence allow us to empathise; they help us adjust how we communicate so that we connect better with the person we are speaking with and arrive at some form of resolution, hopefully through a less stressful engagement. In essence, good customer service needs empathy as its foundation.

If this technology were implemented pervasively throughout enterprises as part of their standard Unified Communications solutions, I wonder whether it would make staff more empathetic towards one another, since we would have a sense (during phone conversations, at least) of how people were feeling as they spoke to us. Or, on the other hand, I wonder whether it would distract from the subject of the conversation, as our focus shifts between what is being said and what the technology infers from how the words are articulated. Will it make us more critical of how we communicate, or will it force us to learn to communicate more pleasantly and more effectively?

I think we already have the opportunity to be more empathetic and to communicate more pleasantly – it’s our choice. The only difference is that this technology can give us real-time feedback, which some people may or may not appreciate. At the end of the day it will challenge us, as humans, to be more empathetic, and challenge us, as communicators, to choose words that convey the positive emotions we want to feel ourselves and want others to feel.

Since this technology interprets how the human voice translates to emotions, I wonder whether, in 30 or 50 years’ time, it will evolve in a direction that helps us better understand the emotional state of babies from the sounds they make. Or help us understand how animals feel, and how they communicate with other animals or with humans. Or whether, in the medical industry, doctors will be able to diagnose particular conditions from how our hearts or digestive systems sound.

Well done eXaudios Technologies and best of luck with your technology.

I would love to hear your views and thoughts on this subject (you can reach me at david.porta@intergen.co.nz).  Where do you think all this might take us? And is it a good thing or a bad thing?

Posted by: David Porta, Service Line Lead, Unified Communications | 18 June 2010

Tags: Unified Communications

