Match game: Why AI could bring the human touch to the call center
Consumers loathe the automated enterprise call center. Nothing is more frustrating than yelling into a phone at an interactive voice response (IVR) system that just doesn’t seem to understand what you need. But few realize that most enterprises aren’t fans of current automated call center solutions either, often complaining that the technologies are complicated, expensive and frustrating for users.
Part of the issue is that call center technology has evolved slowly. Today, nearly 80% of the call center market is still on premises, according to Talkdesk. But experts believe modern call center technologies will soon accelerate the pace of change.
Cloud-based automated call centers that include artificial intelligence technologies could soon help companies achieve better interactions with callers and potentially put an end to caller frustrations. Such solutions may also give companies that rely on call centers an easier path to contact center automation, streamlined operations and potential cost savings.
Last month, Amazon Web Services announced Amazon Connect, a self-service, cloud-based contact center service for the enterprise. The service responds to questions over the phone or via text using Lex, Amazon's chatbot-building service built on the same technology as its virtual assistant Alexa, and Polly, its text-to-speech engine. Amazon’s entrance into this space puts on-premises call center vendors on alert.
"When customers have more choice, they tend to migrate faster from old technologies," said Gadi Shamia, COO at Talkdesk. "We have seen it happening in other industries like CRM, and we have seen it happening in the call center space in the last two to three years."
AI in the call center
Some companies see AI as a magic bullet, and envision sending customers to chat directly with bots. But experts say that won’t necessarily be the best use of AI.
"If you ever used AI bots, [such as] Siri, you know how often they get simple questions wrong — now imagine letting it deal with your customers directly," said Shamia.
A better use of AI is likely routing the customer to the best available agent. With today’s technology, customers can meet most of their self-service needs via a website or a mobile app, so they typically only pick up the phone when those options fail them. Getting a customer to a trained representative as quickly and effortlessly as possible is a key goal for any customer-centric company. In other words, it all comes down to the customer experience.
For example, explains Shamia, when a customer calls a hotel reservation company directly from the company’s app, an AI bot can assess everything it knows about the customer and send her to the appropriate agent without an IVR system or a long wait.
An AI bot can also learn from its mistakes. Every time a call is transferred, the bot can assume it could have been handled better and improve its algorithm over time. As a result, customers get faster and more accurate answers, which could lead to higher customer satisfaction and brand loyalty.
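The learning loop described above — treat every transferred call as a sign the routing could have been better, and adjust — can be sketched in a few lines. This is a minimal illustration, not any vendor's actual system; the class, scoring scheme and group names are all hypothetical.

```python
from collections import defaultdict

class LearningRouter:
    """Toy router that penalizes issue-to-group pairings that led to transfers."""

    def __init__(self, agent_groups):
        self.groups = agent_groups
        # Every (issue, group) pairing starts with a neutral score of 0.0.
        self.scores = defaultdict(lambda: defaultdict(float))

    def route(self, issue):
        # Send the caller to the group with the highest learned score for this issue.
        return max(self.groups, key=lambda g: self.scores[issue][g])

    def record_outcome(self, issue, group, transferred):
        # A transfer is evidence the routing could have been handled better;
        # a call resolved without transfer reinforces the pairing.
        self.scores[issue][group] += -1.0 if transferred else 0.5
```

Over many calls, pairings that keep producing transfers sink in score and stop being chosen, which is the "improve its algorithm over time" behavior in miniature.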
One stumbling block for call center automation has been voice-activated systems' limited grasp of empathy and emotion. Though systems like Siri and Alexa can understand speech, and even speak, they can’t necessarily understand the emotion and tone behind the words.
Understanding emotion is especially important in call centers, because callers rarely pick up the phone because they are happy. Researchers at Mattersight, a company that makes behavioral routing software for call centers, recently analyzed over 118,000 customer service calls at 11 large enterprises and found callers exhibited emotional signs of anger 54% of the time during the first half of the call. They also displayed emotional signs of sadness and fear in their speech more than half of the time.
"The challenge with AI today is that ... voice-commanded assistants are not equipped to handle many types of conversations," said Andy Traba, vice president of Data Science at Mattersight.
For example, when customers disagree with their bill, they prefer to talk with an agent to get the issue resolved. In those cases, customers feel more comfortable knowing that the voice on the other end can empathize with and understand them in ways AI cannot.
In response, Mattersight is working with AI designers to reimagine the approach to speech recognition and to analyze, and potentially automate, empathy in AI bots using a NASA-created personality model and speech recognition algorithms. Such algorithms can potentially analyze the tone, tempo, grammar and syntax of speech and then pair the customer with the call center agent best suited to their personality and current behavior.
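One simple way to picture this kind of pairing is as a nearest-neighbor match between a caller's speech features and each agent's style profile. The sketch below assumes made-up features (tempo, pitch variability, formality) and agent profiles; it is not Mattersight's model, just an illustration of the matching idea.

```python
import math

# Hypothetical style profiles per agent. Feature names are illustrative stand-ins
# for the tone, tempo, grammar and syntax signals described in the article.
AGENT_PROFILES = {
    "alice": {"tempo": 2.0, "pitch_var": 0.3, "formality": 0.8},
    "bob":   {"tempo": 3.5, "pitch_var": 0.7, "formality": 0.4},
}

def distance(caller, agent):
    # Euclidean distance across the shared feature set.
    return math.sqrt(sum((caller[k] - agent[k]) ** 2 for k in caller))

def best_match(caller_features, profiles=AGENT_PROFILES):
    # Pair the caller with the agent whose speaking style is closest to theirs.
    return min(profiles, key=lambda a: distance(caller_features, profiles[a]))
```

A fast-talking, informal caller would land with the agent whose profile is most similar, rather than with whoever happens to be next in the queue.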
Mattersight says knowing a customer's personality and preferred style of communication could have huge implications for the overall efficiency of the call center, and AI will be key to accomplishing that, and more.
"In the future, AI will be plugged into all knowledge sources, so it will continuously learn based on a customer’s past activity — what they were doing on the website, what they previously called about, whether a claim was just denied or a late charge was just added to a bill, etc.," said Traba. "What this will allow is that when you call into an enterprise, an intelligent routing engine will predict the intent behind the call and then determine the appropriate routing treatment."
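At its simplest, the intent prediction Traba describes can be thought of as mapping recent customer events to a likely reason for the call. The rules and event names below are invented for illustration; a production engine would learn these mappings rather than hard-code them.

```python
# Hypothetical event-to-intent rules, checked in priority order. The events
# echo the examples in the quote: a denied claim, a new late charge, site activity.
INTENT_RULES = [
    ("claim_denied", "claims_dispute"),
    ("late_charge_added", "billing_dispute"),
    ("viewed_upgrade_page", "sales"),
]

def predict_intent(recent_events):
    # Scan the rules in priority order; fall back to a general queue
    # when no recent event suggests a specific reason for the call.
    for event, intent in INTENT_RULES:
        if event in recent_events:
            return intent
    return "general_support"
```

With the predicted intent in hand, a routing engine could skip the IVR menu entirely and send the caller straight to the right queue.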