A Blog by Jonathan Low

 

Sep 17, 2016

Customer Service Bots Are Getting Better At Detecting Your Agitation

Whether they are getting better at then doing something about it remains to be seen. JL

Signe Brewster reports in MIT Technology Review:

[Humans] change our behavior in reaction to how whoever we are talking to is feeling or what we think they’re thinking. The system is designed to identify emotional state based on a variety of cues, including typing patterns, speech tone, facial expressions, and body movements.
SRI International, the Silicon Valley research lab where Apple’s virtual assistant Siri was born, is working on a new generation of virtual assistants that respond to users’ emotions.
As artificial-intelligence systems such as those from Amazon, Google, and Facebook increasingly pervade our lives, there is an ever greater need for the machines to understand not only the words we speak, but what we mean as well—and emotional cues can be valuable here (see “AI’s Language Problem”).
"[Humans] change our behavior in reaction to how whoever we are talking to is feeling or what we think they’re thinking,” says William Mark, who leads SRI International's Information and Computing Sciences Division. “We want systems to be able to do the same thing."
SRI is focused first on commercial partners for the technology, called SenSay Analytics.
The system is designed to identify emotional state based on a variety of cues, including typing patterns, speech tone, facial expressions, and body movements.
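To make the multi-cue idea concrete, here is a minimal late-fusion sketch in Python: each cue source scores the same set of emotions, and a weighted average combines them. The modalities, weights, and emotion labels are invented for illustration; SRI has not published how SenSay actually combines cues.

# Hypothetical late-fusion sketch; the modalities, weights, and
# emotion labels here are invented, not SenSay's actual design.
WEIGHTS = {"typing": 0.15, "voice": 0.35, "face": 0.35, "body": 0.15}

def fuse(scores_by_modality: dict) -> dict:
    """Weighted average of per-modality emotion scores; modalities
    that produced no signal simply contribute nothing."""
    fused = {}
    for modality, scores in scores_by_modality.items():
        weight = WEIGHTS[modality]
        for emotion, score in scores.items():
            fused[emotion] = fused.get(emotion, 0.0) + weight * score
    return fused

fused = fuse({
    "voice": {"agitated": 0.7, "calm": 0.3},
    "face": {"agitated": 0.6, "calm": 0.4},
})
print(max(fused, key=fused.get))  # agitated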
SenSay could, for example, add intelligence to a pharmacy phone assistant. It might be able to tell from a patient’s speech patterns whether he or she is becoming confused, and then slow down.
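A sketch of what “slow down” could mean in code, assuming an upstream model emits a confusion score between 0 and 1 (the score, rates, and function are hypothetical):

# Hypothetical pacing rule: slow the assistant's speech rate as an
# estimated confusion score (0 = none, 1 = maximum) rises.
def speech_rate(confusion: float, base_wpm: int = 160, floor_wpm: int = 100) -> int:
    """Interpolate from base_wpm down toward floor_wpm."""
    confusion = max(0.0, min(1.0, confusion))  # clamp to [0, 1]
    return round(base_wpm - confusion * (base_wpm - floor_wpm))

print(speech_rate(0.0))  # 160 words per minute for a confident caller
print(speech_rate(0.8))  # 112 words per minute for a struggling one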
The machine-learning-based technology is trained on different scenarios, depending on how it will be used. The new virtual assistants can also monitor for specific words that give away a person’s mental state.
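The word-monitoring piece can be as simple as matching a transcript against small cue lists, as in this hypothetical sketch (the word lists and state labels are made up for illustration; a real system would learn them from data):

import re

# Hypothetical cue lists mapping mental states to giveaway words.
MENTAL_STATE_CUES = {
    "frustrated": {"ridiculous", "annoying", "useless"},
    "confused": {"unclear", "lost", "huh"},
}

def flag_mental_states(utterance: str) -> dict:
    """Return the cue words found for each state, if any."""
    words = set(re.findall(r"[a-z]+", utterance.lower()))
    hits = {state: sorted(words & cues) for state, cues in MENTAL_STATE_CUES.items()}
    return {state: found for state, found in hits.items() if found}

print(flag_mental_states("This is useless, I am totally lost"))
# {'frustrated': ['useless'], 'confused': ['lost']}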
It works via text, over the phone, or in person. If someone pauses as he or she types, it could indicate confusion. In person, the system uses a camera and computer vision to pick up on facial characteristics, gaze direction, body position, gestures, and other physical signals of how a person is feeling.
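Typing pauses are straightforward to measure in principle: capture keystroke timestamps and look for unusually long gaps. This hypothetical sketch counts long pauses as a crude confusion signal (the threshold and the two-pause rule are invented):

# Hypothetical pause detector over keystroke timestamps (in seconds).
PAUSE_THRESHOLD_S = 3.0  # assumed cutoff for a "long" pause

def long_pauses(stamps: list, threshold: float = PAUSE_THRESHOLD_S) -> int:
    """Count inter-keystroke gaps longer than `threshold` seconds."""
    return sum(1 for a, b in zip(stamps, stamps[1:]) if b - a > threshold)

def looks_confused(stamps: list) -> bool:
    """Crude rule: two or more long pauses while typing one message."""
    return long_pauses(stamps) >= 2

print(looks_confused([0.0, 0.4, 0.9, 4.5, 5.0, 9.2, 9.6]))  # True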
While virtual assistants are becoming more common on our personal devices and in our customer service interactions, the technology is still limited. Most people still use voice-controlled interfaces for only the simplest tasks. Amazon recognizes that and is working on injecting emotional intelligence into Alexa, the virtual assistant that powers its Echo speaker (see “Amazon Working on Making Alexa Recognize Your Emotions”). And earlier this year Apple acquired Emotient, a startup that built facial-expression-analysis technology that could end up finding its way into Siri.
Since Apple acquired Siri in 2010, SRI International has been thinking about what comes next. It recently spun off Kasisto, which makes an artificial-intelligence platform trained to complete digital tasks such as transferring money or answering customer questions.
The lab’s research also extends into the Internet of things—it’s experimenting with how to bring virtual assistants to the smart home and other connected spaces.
Mark says bots can build trust by performing well, but also by explaining what they are doing or saying. That can reassure users that a bot has understood them and is completing the requested task.
