Artificial intelligence (AI) is still relatively new territory for most organisations. When implementing AI projects, organisations often get stuck along the way because they lack a clear understanding of the process. In this article, we walk through the stages of chatbot implementation, what each step entails, and what it requires of the business.
Arguably the most fundamental stage of all is establishing the business case for deploying a virtual assistant (VA) or chatbot. Typically, this starts with identifying a clear problem statement, which then leads to evaluating possible solutions.
This is the stage where the project's ROI is calculated and the target state is defined against the current state. Is the project's objective to increase revenue, enhance customer experience, or reduce customer service costs? Each entails different metrics to measure and compare. Clear targets also align expectations from the c-suite down to the executive levels and pave the way for consistent communications. The most critical output here is the set of Key Performance Indicators (KPIs) that will measure the success of the project.
Subsequently, intensive market research whittles down the options: by looking at what competitors have implemented and what else is available in the market, the company decides what kind of solution to deploy. This phase could also involve running a proof of concept internally or with external vendors to verify the demand.
The inception of a VA project doesn't happen by chance; only businesses that are ready for it take the calculated leap of faith to make it happen. (Unsure if your organisation is prepared for an AI virtual assistant? Read our blog post on Corporate Readiness Assessment here.)
After settling on your use case, the next step is to gather all the existing data you have to help build your AI assistant. Different use cases call for data from various users/sources.
Human resource use case: HR helpdesk ticketing systems, email threads between HR operations executives and employees, instant messaging on corporate messaging tools like Microsoft Teams, Skype and Slack
Customer service use case: call centre call logs, social media channels' inboxes and comments, live chat histories, customer service/support email inboxes and contact forms
Notice how FAQs are not listed as a good source for data gathering? That's because FAQs are created for informational purposes in a text-heavy format. They are not a good representation of how end-users ask their questions, which tend to be shorter and can be phrased in many different ways.
As a rule of thumb, the data used to train your AI chatbot should be as representative as possible of how real end-users will interact with your business. For example, live chat support logs are much more suitable for training AI VAs than emails or call log transcriptions, because they closely reflect how users are likely to pose questions to the VA. This representative data is commonly called the "ground truth".
These data points are then used as training data to seed the AI.
Essentially, Natural Language Processing (NLP) is a subfield of artificial intelligence that enables machines to understand unstructured human language. (If you haven't read our posts on NLP and ML 101, you can get acquainted here.)
In this phase, the two most important things to be aware of are the concepts of Intents and Entities. Think of this step as teaching the VA to make sense of the words thrown at it.
An intent is essentially the end-users' intention when they interact with the VA, i.e. what they want to get out of it. It is usually associated with a verb, or an action.
"I forgot my password, can you help me reset it?"
"How to reset my password?"
"I got locked out of my account, please help me log back in."
Above are examples of the same intent: getting help with resetting the user's password.
Entities are keywords or pieces of information within a sentence that help the VA ascertain what the user is asking for and provide a more meaningful resolution. Common examples are nouns such as product names and types of clothing, as well as dates, times and locations, as shown in the following example:
"I need a travel insurance for my holiday in Bali. I'll be gone from 13 to 20 November."
Travel insurance – Entity: product_type
Bali – Entity: location
13 to 20 November – Entity: start_date and end_date
Together, intents and entities form the basic foundation of how AI understands words using Natural Language Processing.
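To make the concepts concrete, here is a minimal, purely illustrative sketch of intent classification and entity extraction using keyword rules and a regular expression. Real NLP engines use trained statistical models rather than hand-written rules, and the intent names and entity types below are hypothetical, not KeyReply's actual schema.

```python
import re

# Hypothetical intent names mapped to trigger phrases (illustration only;
# production systems learn these patterns from labelled training data).
INTENT_KEYWORDS = {
    "reset_password": ["forgot my password", "reset my password", "locked out"],
    "buy_travel_insurance": ["travel insurance"],
}

def classify_intent(utterance: str) -> str:
    """Return the first intent whose trigger phrase appears in the utterance."""
    text = utterance.lower()
    for intent, phrases in INTENT_KEYWORDS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "fallback"  # no intent matched: hand off or ask to rephrase

def extract_entities(utterance: str) -> dict:
    """Pull out the date range and location entities from the example sentence."""
    entities = {}
    date_range = re.search(
        r"(\d{1,2}) to (\d{1,2}) (January|February|March|April|May|June|"
        r"July|August|September|October|November|December)", utterance)
    if date_range:
        entities["start_date"] = f"{date_range.group(1)} {date_range.group(3)}"
        entities["end_date"] = f"{date_range.group(2)} {date_range.group(3)}"
    if "bali" in utterance.lower():
        entities["location"] = "Bali"
    return entities

utterance = ("I need a travel insurance for my holiday in Bali. "
             "I'll be gone from 13 to 20 November.")
print(classify_intent(utterance))   # buy_travel_insurance
print(extract_entities(utterance))
```

The point of the sketch is the shape of the output, not the method: whatever the underlying model, the NLP layer reduces free-form text to a structured intent plus entities that downstream logic can act on.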
The next phase addresses how answers are returned to the end-users.
Now that your VA understands, it must learn how to respond. This stage maps out all the possible answers the chatbot can return, ranging from responses as simple as a direct answer to orchestrating complex business logic to return personalised ones.
Turning back to the FAQ -- most businesses have a knowledge base or FAQ to start with -- while they may not be good sources for intent gathering, they are great blueprints to shape the content and answers of the chatbot.
Additionally, integrations with the business' backend systems, such as ERP, OMS, CRM or omni-channel systems, also happen in this phase. Companies have to assess the readiness of their current systems to connect to external solutions, for example by building APIs (application programming interfaces) or a middle layer to allow for connectivity.
This step provides a giant leap forward for the VA's capabilities, enabling it to complete end-to-end transactions independently.
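Conceptually, the middle layer described above routes a resolved intent and its entities to the right backend call. The sketch below illustrates that routing with a stubbed backend function; the handler names, payload fields and return values are all invented for illustration, and a real integration would call the actual ERP/OMS/CRM API instead.

```python
def backend_create_quote(product_type, location, start_date, end_date):
    # Stand-in for an HTTP call to a (hypothetical) insurer quoting API.
    return {"status": "ok", "quote_id": "Q-0001", "product": product_type}

# Map each intent to the backend action that fulfils it.
HANDLERS = {
    "buy_travel_insurance": lambda e: backend_create_quote(
        e["product_type"], e["location"], e["start_date"], e["end_date"]),
}

def fulfil(intent, entities):
    """Route a resolved intent to its backend handler, or escalate."""
    handler = HANDLERS.get(intent)
    if handler is None:
        return {"status": "handoff"}  # no integration: hand off to a human
    return handler(entities)

result = fulfil("buy_travel_insurance", {
    "product_type": "travel insurance", "location": "Bali",
    "start_date": "13 November", "end_date": "20 November"})
print(result["status"])  # ok
```

This intent-to-handler mapping is what turns the VA from an answer bank into a transactional assistant: once the NLP layer produces structured output, fulfilment is ordinary backend plumbing.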
KeyReply has been lauded by enterprises in healthcare, insurance, retail and logistics for our smooth implementation and integrations. Want to see a quick demo? Send in a request today.
In this step, we conduct rigorous user acceptance testing (UAT) to ensure that the VA is capable on two fronts. First, that it can answer enquiries from its existing knowledge base, i.e. the content library populated in Step 4. We regularly run automated batch tests on the chatbot with a pre-defined set of questions (the Test Set) to ascertain its scope of coverage, also called its Comprehension Level.
Second, is how well it achieves the business objectives it was designed to deliver. A well-defined use case in Step 1 will have precise metrics to determine the VA's performance. For example, a sales lead generation chatbot will be measured by how many leads it has successfully collected. To test how well the VA captures leads, test cases will be drawn out to ensure that testers can easily interact with the chatbot and leave their details behind for follow-up. Feedback gathered from these tests will form the basis for the rounds of iteration to improve user experience flow, content changes, and bolster the chatbot's knowledge base.
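An automated batch test of the kind described above can be sketched in a few lines: run every Test Set question through the bot and report the share it can answer from its knowledge base. The `answer()` function here is a trivial stand-in for the real system, which would run full NLP intent classification, and the sample questions and threshold are invented for illustration.

```python
# Hypothetical one-entry knowledge base for the sketch.
KNOWLEDGE_BASE = {
    "reset_password": "You can reset your password from the login page.",
}

def answer(question):
    # Stand-in for the real chatbot's NLP pipeline.
    if "password" in question.lower():
        return KNOWLEDGE_BASE["reset_password"]
    return None  # question falls outside the bot's current scope

TEST_SET = [
    "How do I reset my password?",
    "I forgot my password",
    "What is your refund policy?",   # deliberately out of scope
]

answered = sum(1 for q in TEST_SET if answer(q) is not None)
comprehension_level = answered / len(TEST_SET)
print(f"Comprehension Level: {comprehension_level:.0%}")  # 67%
```

Tracking this percentage across batch runs shows whether content additions between UAT rounds are actually widening the bot's coverage.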
Once the main framework of the VA is set up by the project team, we enable the Business-As-Usual (BAU) users to operate and train the chatbot. This training equips business users with the skills to manage the backend system, train the chatbot, and make tweaks such as refining content and answers independently.
For example, the customer service team might learn how to update the answer bank and operate the live chat module, while the marketing department learns how to update the look-and-feel of the chat widget.
A big part of any digital transformation initiative is change management; stakeholder buy-in is absolutely critical. Ensuring that internal and external stakeholders are ready for the impending new technology and have their expectations aligned helps with a successful roll-out.
In internal stakeholder alignment, it is paramount that the c-suite become project evangelists to spread the word and awareness of the VA. The following are examples of activities that happen in different functions to prepare for go-live:
Marketing: inform external customers of the chatbot's launch date and the channels it will be deployed on and educate them on what the VA can do for them.
Corporate Communications & Public Relations: prepare press releases for the official launch. Press releases are especially relevant if the use of VA is not already prevalent in the industry. Internal memos will be circulated within the company.
Human Resources: prepare training materials or a quiz in the company’s internal training site.
At this juncture, the VA is up and running. Improvement in AI performance is a result of consistent training. Post-go-live, business users have to classify incoming utterances (all questions from end-users to the VA) diligently; users' actual interactions are valuable sources of training data that can be used to improve the accuracy of the VA.
Monitoring ensures that we know what end-users are asking the chatbot, and by analysing trends in incoming enquiries, we can determine whether the VA's knowledge and capabilities need to be extended or expanded.
While the steps in this article have been written chronologically, the reality is far less linear.
Implementing an AI VA or chatbot is a highly iterative process. Think design thinking methodologies, where rounds and rounds of iteration happen over short periods of time to ensure a good product-market fit. The same approach is applied to improving overall AI accuracy and performance. A consistent effort from business users to maintain and review data is the key to ensuring long-lasting success.
Find out how we can make chatbot implementation a smooth and seamless experience for you. Ask us for a quick demo today.