A Blog by Jonathan Low


Feb 1, 2017

Call Center Workers Are Receiving Coaching From Socially Aware AI Software

Machine learning may make devices more sensitive than humans to moods, and to the attendant opportunities they offer. JL

Will Knight reports in MIT Technology Review:

Software is getting better at analyzing social interactions and emotions thanks to machine-learning techniques and training data. Call-center workers are receiving real-time coaching from software that analyzes their speech and their interactions with customers. As they are talking, the software might recommend they talk more slowly or interrupt less often. “Providing feedback on the mental states that they might be inducing in other people seems valuable.”
Next time you call customer support, the person on the other end of the line may be getting a little help from emotionally intelligent AI software.
Some call-center workers are now receiving real-time coaching from software that analyzes their speech and the nature of their interactions with customers. As they are talking to someone, the software might recommend that they talk more slowly or interrupt less often, or warn that the person on the other end of the line seems upset.
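To make that concrete, here is a minimal sketch of the kind of heuristic such a coach might apply, assuming timestamped speaker turns as input. The Turn structure, the coach function, and the thresholds are all invented for illustration; nothing here is Cogito’s actual method.

```python
from dataclasses import dataclass

@dataclass
class Turn:
    speaker: str   # "agent" or "customer"
    start: float   # seconds from the start of the call
    end: float
    words: int

def coach(turns, agent="agent", max_wpm=170, max_interrupts=2):
    """Toy real-time coaching heuristic over a window of recent turns.

    Flags the agent for talking too fast (words per minute) or for
    starting to speak before the customer has finished (interruptions).
    Thresholds are illustrative, not calibrated against real calls.
    """
    tips = []
    agent_turns = [t for t in turns if t.speaker == agent]
    spoken = sum(t.end - t.start for t in agent_turns)
    words = sum(t.words for t in agent_turns)
    if spoken > 0 and words / (spoken / 60) > max_wpm:
        tips.append("Slow down: you are speaking unusually fast.")
    interrupts = sum(
        1
        for prev, cur in zip(turns, turns[1:])
        if cur.speaker == agent and prev.speaker != agent and cur.start < prev.end
    )
    if interrupts > max_interrupts:
        tips.append("Let the customer finish before responding.")
    return tips

turns = [
    Turn("customer", 0.0, 6.0, 18),
    Turn("agent", 5.2, 9.0, 16),    # starts before the customer stops
    Turn("customer", 9.5, 12.0, 8),
    Turn("agent", 11.4, 20.0, 40),
]
print(coach(turns, max_interrupts=1))
```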
This gives us a fascinating glimpse of how AI and humans might increasingly work together in the future. Plenty of routine work is being automated in call centers and other back-office settings, but real human interaction seems likely to resist automation for a long while yet. Even so, AI software may change the way people interact with customers by serving in an advisory capacity.
The call-center software is supplied by Cogito, a company based in Boston. Its software automatically assesses the dynamics of a conversation, and has been trained to recognize certain pertinent characteristics. Rather than the substance of a conversation, it analyzes the raw audio. “Conversation is like a dance,” says Josh Feast, CEO of Cogito. “You can tell whether people are in sync, and it turns out this is a much better measure than language.”
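A hedged sketch of what working from the raw audio, rather than a transcript, can look like: frame-level statistics computed directly from waveform samples. The feature set below is a generic guess at this style of analysis, not the product’s actual pipeline.

```python
import numpy as np

def prosodic_features(samples: np.ndarray, sr: int, frame_ms: int = 30):
    """Crude prosodic summary of a mono audio signal.

    Works directly on the waveform, with no transcript: frame-level
    RMS energy (loudness) and zero-crossing rate (a rough proxy for
    voicing/pitch). Real systems use far richer acoustic features.
    """
    frame = int(sr * frame_ms / 1000)
    n = len(samples) // frame
    frames = samples[: n * frame].reshape(n, frame)
    rms = np.sqrt((frames ** 2).mean(axis=1))
    zcr = (np.abs(np.diff(np.sign(frames), axis=1)) > 0).mean(axis=1)
    return {
        "energy_mean": float(rms.mean()),
        "energy_var": float(rms.var()),    # variability can signal agitation
        "zcr_mean": float(zcr.mean()),
        "speech_ratio": float((rms > rms.mean()).mean()),  # rough activity level
    }

# Synthetic two-second "call" for demonstration: an amplitude-modulated tone.
sr = 16_000
t = np.linspace(0, 2, 2 * sr, endpoint=False)
fake_call = 0.1 * np.sin(2 * np.pi * 220 * t) * (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t))
print(prosodic_features(fake_call, sr))
```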
Call centers have long sought to analyze customers’ voices for signs of agitation or frustration. Software is getting much better at analyzing social interactions and emotions thanks to clever new machine-learning techniques and copious amounts of training data.
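In miniature, that machine-learning step might look like fitting a classifier to labeled feature vectors, one per call segment. The sketch below uses scikit-learn and synthetic data purely for illustration; a real system would train on large volumes of human-annotated calls.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Made-up training data: each row is a feature vector summarizing a call
# segment (e.g. energy mean/variance, zero-crossing rate, speech ratio),
# labeled 1 if a human annotator judged the caller agitated.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 1] + 0.5 * X[:, 0] + rng.normal(scale=0.5, size=200) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")

# At call time, the same feature extraction feeds the fitted model.
segment_features = rng.normal(size=(1, 4))
print("agitation probability:", clf.predict_proba(segment_features)[0, 1])
```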
Feast says his company has found that call centers do not want to replace phone workers, but they are keen to improve the way they operate. “Humans are social beings,” he says. “We engage with each other for emotional reasons, and we want somebody to help us, to counsel us.”
Feast founded Cogito in 2007 with Sandy Pentland, a professor in the MIT Media Lab who specializes in studying human dynamics. The company originally developed its technology with funding from DARPA as a way to detect a person’s mental state from his or her speech.
Some companies say Cogito has helped improve the performance of their call-center staff. The health-care company Humana developed a tool for call-center staff using Cogito’s technology and saw a 28 percent improvement in a commonly used measure of customer satisfaction. Feast says employees working with the software typically report greater job satisfaction, too.
Following such success in customer service, Cogito is working on a platform that could see the technology deployed much more widely. Feast says it could be built into videoconferencing software, or used as an aid during business negotiations. He speculates that it might even help in marriage counseling.
“Providing feedback to individuals on the mental states that they might be inducing in other people seems valuable,” says Peter Robinson, a professor at the University of Cambridge, U.K., who studies human-computer interaction. “There are many other applications for this sort of technology.”
But Robinson says it will be important not to rely on such a system too much. “These social signals are at best ambiguous and at worst distinct for different people,” he says.
Rosalind Picard, a professor at the MIT Media Lab who has pioneered emotion tracking (see “Thinking About Emotional Machines”), agrees that it can be problematic to develop such technology. “I think it depends how the interface is built,” Picard says. She notes, for instance, that different people often have different conversational styles. “Many New Yorkers practice a ‘high interruption’ style,” Picard says. “Interrupting can thus be likeable and build rapport with them. But the same behavior with some other callers could be seen as rude.”
