Artificial Intelligence
in customer contact

dr. Arjan van Hessen

Telecats

Will smart systems with Artificial Intelligence make the difference in customer contact over the next 5 years?

Since 2016, AI has been a buzzword in "our" world. We speak of Artificial Intelligence (AI), Machine Learning (ML) and Deep Learning (DL) with Deep Neural Networks (DNNs). But what exactly is the difference between these techniques? As used in the popular press, AI is an umbrella term for non-human "devices" that, in our eyes, display some form of intelligence.

Machine Learning

Deep Learning

With the advent and massive use of Web 2.0, cloud computing and better algorithms, it became possible to train on millions of examples from the web instead of hundreds from your own database. One of the first widely discussed examples (2012) of Deep Learning was the system Google developed to recognize images of cats. Other large companies such as Microsoft, Amazon, Apple, HP and Facebook also came up with striking examples of how Deep Learning (learning with Deep Neural Networks) can be used.

A striking example for "our world" was Microsoft's use of DNNs for automatic speech recognition (ASR; Microsoft/InterSpeech, 2011). In the preceding years, the improvement in ASR results had been rather limited: it seemed as if we had reached a kind of ceiling in what was possible with ASR. After Microsoft's presentation, everyone switched to DNN-based ASR, making speech recognition much, much better.

From the Labs to the Real World

In recent years, we have seen AI appear in the real world. Without much publicity, existing and new applications are being equipped with AI, something that will continue in the coming years. "Artificial Intelligence" will be used in an ever-wider variety of application areas. For example, IBM's Watson is already used for legal issues and in health care, Google uses the image-recognition technology behind the "cat recognizer" to recognize friends in your own photo albums, Amazon uses AI in its Alexa (Echo) platform, and various car makers and other companies are experimenting with AI for driver assistance and self-driving. In all these examples, AI is used to model human routine. Self-driving vehicles perform quite well most of the time but sometimes fail in unexpected situations outside the "normal" course of events.

Customer contact

One of the areas where AI will undoubtedly be deployed is (virtual) human-machine interaction (HMI). In all remote contacts (web, telephone, chat) in which people "want" something from an organization or company, the knowledge employees have gained through years of experience plays a major role: after all, they know how to interpret and answer a question, what customers like or dislike, and how to best achieve the goals of both the client and the company. This knowledge is not simply available; it must be acquired through separate training, on-the-job training, help from experienced colleagues and a portion of common sense. The longer an employee works with customers, the better and faster he or she can assess how to help someone optimally. This learning process does not differ fundamentally from that of AI systems. By collecting as many question-answer combinations as possible and training the AI engine with them, again and again, the engine can "learn" how to deal with these questions.
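The idea of "learning from question-answer combinations" can be sketched as a simple text classifier that maps customer questions to known call reasons. This is a minimal illustration, not Telecats' actual system; the example questions and intent labels below are invented, and scikit-learn is assumed as the learning library.

```python
# Minimal sketch: learn question -> intent mappings from historical
# examples, then classify a new, unseen formulation of a question.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical historical question-answer (here: question-intent) pairs.
training_questions = [
    "I cannot pay my invoice online",
    "why is my bill higher this month",
    "my invoice seems wrong",
    "my internet connection keeps dropping",
    "the router will not start",
    "no connection since this morning",
]
training_intents = ["billing", "billing", "billing",
                    "technical", "technical", "technical"]

# Turn each question into word weights, then fit a linear classifier.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(training_questions, training_intents)

# A new customer phrases the same need differently.
suggestion = model.predict(["there is a mistake on my invoice"])[0]
```

In practice the training set would contain thousands of real, logged question-answer pairs rather than six toy sentences, and retraining would happen continuously as new contacts come in.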

At Telecats we distinguish 4 different, adjacent steps in customer contact:

Who contacts us? Can we find out who is contacting us by means of a few smart questions and using available data?

Why does he/she contact us? What is the reason that someone seeks contact? Try to find out by asking questions and combine the recognized answers with the available predictions.

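The first two steps above can be sketched in a few lines: look the caller up in the available data, then recognize the call reason from what he or she says. This is a toy illustration under stated assumptions; the CRM records, phone numbers and keyword lists are all invented, and a real system would use speech recognition plus a trained classifier rather than keyword overlap.

```python
# Sketch of step 1 (who contacts us?) and step 2 (why?).
crm = {  # hypothetical customer database keyed on caller ID
    "+31201234567": {"name": "J. Jansen", "segment": "business"},
    "+31612345678": {"name": "P. de Vries", "segment": "consumer"},
}

reason_keywords = {  # hypothetical call-reason vocabulary
    "billing": {"invoice", "bill", "payment"},
    "technical": {"internet", "router", "connection"},
}

def identify_caller(number):
    """Step 1: look the caller up in the CRM, if possible."""
    return crm.get(number)

def recognize_reason(utterance):
    """Step 2: pick the reason whose keywords overlap most with the utterance."""
    words = set(utterance.lower().split())
    best = max(reason_keywords, key=lambda r: len(reason_keywords[r] & words))
    return best if reason_keywords[best] & words else None

caller = identify_caller("+31612345678")
reason = recognize_reason("My internet connection is very slow")
```

Once both steps succeed, the combination (known customer, recognized reason) is what makes smart routing or an automatic answer possible.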
The first three steps already give a lot of information. If you know who is seeking contact, you (usually) know more about that person (gender, age, origin, etc.). If you know what the customer's question is, you can combine all available data to estimate what to do next. However, not all decision suggestions of the AI system will be correct, so employees have to check the final decision and adjust it if necessary. These corrected answers are then "returned" to the system so it can "learn" to improve its answers the next time. In this way, the factor "time" is taken into account: new, different answers to "old, existing" questions, and answers to new questions, keep adapting the system.
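The correction loop described here can be sketched as follows: the agent's corrected label is appended to the training data and the model is refitted, so the system adapts over time. A minimal sketch, assuming scikit-learn and invented example data; a production system would batch retraining rather than refit on every correction.

```python
# Sketch of the human-in-the-loop correction cycle: suggest, correct, refit.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

questions = ["my invoice is wrong", "no internet connection"]
labels = ["billing", "technical"]

def refit():
    """Rebuild the model on the current (possibly corrected) data."""
    m = make_pipeline(TfidfVectorizer(),
                      KNeighborsClassifier(n_neighbors=1))
    m.fit(questions, labels)
    return m

model = refit()

def correct(question, corrected_label):
    """The agent overrules the suggestion; feed the correction back."""
    questions.append(question)
    labels.append(corrected_label)
    return refit()

# A new kind of question arrives; the agent teaches the system a new label.
model = correct("I want to cancel my subscription", "cancellation")
```

After the correction, a similar future question can be handled without the agent: the system has "learned" from its mistake.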

What role does AI already play in customer contact?

It is difficult to answer this question precisely, because it is not always clear which AI components are used in which customer-contact software. What we can do is give an overview of the possibilities AI offers. If companies have cleaned and updated their data, they can use it for Predictive Analytics to "predict" when certain types of questions will be asked. In addition, AI can help handle customers better, because it can suggest to call-centre agents the most likely answers to a given question. These improved services are the result of better speech recognition and better routing and/or automatic answering of questions.
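The Predictive Analytics idea, in its simplest form, is counting: from historical contact data you can estimate which type of question is most likely at a given moment. A toy illustration with fabricated history, using only the standard library:

```python
# Toy "Predictive Analytics": count historical contact reasons per
# weekday, then predict the dominant reason for a new contact.
from collections import Counter, defaultdict

history = [  # (weekday, reason) pairs from hypothetical past contacts
    ("Mon", "billing"), ("Mon", "billing"), ("Mon", "technical"),
    ("Fri", "technical"), ("Fri", "technical"), ("Fri", "billing"),
]

per_day = defaultdict(Counter)
for day, reason in history:
    per_day[day][reason] += 1

def most_likely_reason(day):
    """Predict the most frequent historical call reason for that weekday."""
    return per_day[day].most_common(1)[0][0]
```

A real deployment would condition on far more than the weekday (recent invoices sent, outages, marketing campaigns), but the principle is the same: use the past to pre-sort the present.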

What role will AI play in customer contact in the near future?

AI makes it possible to help customers in a much more serious way. Thirty years ago, telephony-based self-service started with pressing a telephone key ("press 1 for..."). Then, in the 1990s, "simple" speech recognition made it possible for callers to select a topic by saying a name ("who do you want to speak to?"). About 15 years ago it became possible to identify the call reason by stating it in your own words ("Hello, how can we help you?"). It was no longer necessary to say exactly the words the application developers had in mind when designing the application: callers could use their own phrasing to specify why they were calling or what they needed from the company or organisation. But even with 100% correct speech recognition, it was sometimes difficult to give an adequate answer to every question. AI will help, because AI-based systems can learn from previous messages how to handle new ones. It will help us take the last barrier: from recognizing to understanding!
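The shift from exact menu phrases to open input can be illustrated by matching a caller's free formulation to the closest known call reason instead of demanding an exact phrase. A deliberately simple sketch using word overlap (Jaccard similarity); the known phrases are invented, and real systems use trained models rather than this kind of string matching.

```python
# Sketch: map a freely phrased utterance to the closest known call reason,
# rather than requiring the caller to say an exact menu phrase.
known_phrases = {  # hypothetical phrase -> call reason examples
    "I want to report a malfunction": "technical",
    "question about my invoice": "billing",
    "I want to change my address": "admin",
}

def jaccard(a, b):
    """Word-overlap similarity between two phrases (0.0 to 1.0)."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

def understand(utterance):
    """Return the reason of the most similar known phrase."""
    best = max(known_phrases, key=lambda p: jaccard(p, utterance))
    return known_phrases[best]
```

With this approach "there is a question about an invoice" and "question about my invoice" end up at the same call reason, even though the words only partially match.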

What are the biggest changes AI will bring in the coming years?

The biggest change we will see due to the growing presence of AI is the arrival of masses of smart systems that learn from and interact with "us". Health care, financial planning, administrative procedures, travel bookings, etc.: these are all activities that nowadays are done by "people with experience". This experience will be learned by AI systems, which will initially support us in our work but, once they reach a certain level, may take over at least some of our tasks.
