This tutorial series is cumulative in nature. To start from the beginning, start here.
At LivePerson, we’re thrilled that advancements in Natural Language Processing (NLP) and Large Language Models (LLMs) have opened up a world of possibilities for Conversational AI solutions. The impact is truly transformative.
In this lesson, you add support for answering FAQs to the Voicebot, and you use an LLM to enrich those answers via Generative AI. The result? Smarter, warmer, and better answers, and a more human-like experience.
Learn about LivePerson’s trustworthy Generative AI solution.
Step 15: Add a knowledge base
- Open the menu on the left side of the page, and select Automate > KnowledgeAI.
- Click Add knowledge base in the upper-right corner.
- On the AI Enabled tab of the window that appears, specify the following:
- Knowledge base name: Enter “Pharmacy FAQs.”
- Content source: Select “CSV.” Then upload the FAQs CSV in the Resources ZIP that you downloaded at the start of the tutorial.
- Click Save.
Step 16: Test the article matching
- Click Articles in the menu at the top of the page.
- In the Answer Tester on the right, ensure the Enriched answers via Generative AI toggle is turned on.
- Enter a few queries to test the article matching:
- What are your pharmacy hours?
- Do you accept insurance?
- Do you give flu shots?
- Click Get answers.
You can ignore the warning in the UI about the default value for the Enrichment prompt. For the purpose of this tutorial, the default prompt works just fine. In a real-world scenario, you should select and use one of your own prompts as soon as possible. (You can copy one of LivePerson's if you choose.)
The Enrichment prompt is the prompt to send to the LLM service when the consumer’s query is matched to articles in the knowledge base. It instructs the LLM service on how to use the matched articles to generate an enriched answer.
Step 17: Integrate the knowledge base into the bot
- Navigate back to Conversation Builder.
- Open the Pharmacy bot, and select Integrations on the menu bar.
- Click Add Integration in the upper-right corner.
- Specify the following:
- Integration Name: FAQSearch
- Response Data Variable Name: FAQSearch
- Integration Type: KnowledgeAI (Learn about this integration type.)
- Knowledge Base: Pharmacy FAQs
- Click Save.
- Click Dialogs on the menu bar, and open the Fallback dialog.
- Delete the one interaction in the dialog using the Delete option in the interaction’s menu.
- Open the interaction tool palette on the right, and add an Integration interaction to the dialog.
- Name the integration interaction “FAQ search,” and select the “FAQSearch” integration that you just created from the dropdown list.
- Turn on the Enriched answers via Generative AI toggle.
Here again, you can ignore the warning in the UI about the default value for the Enrichment prompt. For the purpose of this tutorial, the default prompt works just fine. In a real-world scenario, you should select and use one of your own prompts as soon as possible. (You can copy one of LivePerson's if you choose.)
Next, you select the No Article Match prompt. This is the prompt to send to the LLM service when the consumer’s query isn’t matched to any articles in the knowledge base. It instructs the LLM service on how to generate a response using just the conversation context and the prompt.
If you don’t select a No Article Match prompt, then if a matched article isn’t found, no call is made to the LLM service for a response. But for this tutorial, you'll select one in the next step, so you can see how this provides support for small talk.
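The prompt-selection behavior described above can be summarized in a short conceptual sketch. Note that the function and field names below are illustrative assumptions for this tutorial, not LivePerson APIs:

```javascript
// Conceptual sketch of which prompt is sent to the LLM service.
// Function and field names are illustrative, not LivePerson APIs.
function selectLlmCall(matchedArticles, prompts) {
  if (matchedArticles.length > 0) {
    // Articles matched: send the Enrichment prompt plus the articles.
    return { prompt: prompts.enrichment, context: matchedArticles };
  }
  if (prompts.noArticleMatch) {
    // No match, but a No Article Match prompt is configured: the LLM
    // answers from the conversation context alone (enables small talk).
    return { prompt: prompts.noArticleMatch, context: [] };
  }
  // No match and no No Article Match prompt: no LLM call is made.
  return null;
}

console.log(selectLlmCall([], { enrichment: "Enrich the answer." }));
// → null
```

As the sketch shows, configuring a No Article Match prompt is what turns an unmatched query into an LLM call rather than a dead end.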
- For the No Article Match prompt, copy a LivePerson template, and select to use it as this prompt:
- On the face of the interaction, click + Select prompt.
- In the pop-up window, click Yes, continue.
- Click Prompt templates.
- Select the "Fallback No Article Match EN - Conversation Assist" template.
- In the read-only window that appears, review the template, and click Use template.
- In the editable window, change the name of the prompt to "NoArticleMatch EN - Voicebot". Click Save.
- Still in the Integration interaction, click the Custom Code icon.
- Select the Post-Process Code tab.
- Enter the following code, and click Add Script.
var articleText = botContext.getBotVariable("FAQSearch.article");
var escapedText = escapeSpecialCharacters(articleText);
botContext.printDebugMessage("Article text = " + escapedText);
botContext.setBotVariable("articleText", escapedText, true, false);
This code takes the answer from the knowledge base and uses the escapeSpecialCharacters function to replace certain special characters with their corresponding HTML entities. This helps to ensure that when the answer is displayed or played, the special characters are properly rendered and handled, with no unintended side effects. You will add the escapeSpecialCharacters function to the bot’s Global Functions soon. The code then saves the processed answer in a bot variable named articleText.
- Add a speech statement below the FAQ search integration. Name it “FAQ search success.” For the statement, enter the following:
{$botContext.articleText}
And set its Next Action, so it directs the flow to the “Ask if anything else” question in the “Anything else” dialog.
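Conversation Builder resolves the {$botContext.articleText} placeholder at runtime, substituting the bot variable’s value into the statement. Conceptually, the substitution works like the hypothetical resolver sketched below (the real resolution is performed by the platform, not by this function):

```javascript
// Hypothetical resolver for {$botContext.NAME} placeholders, shown only
// to illustrate the substitution; Conversation Builder does the real work.
function resolvePlaceholders(template, variables) {
  return template.replace(/\{\$botContext\.(\w+)\}/g, (match, name) =>
    name in variables ? variables[name] : match
  );
}

console.log(resolvePlaceholders("{$botContext.articleText}", { articleText: "We are open 9 to 5." }));
// → We are open 9 to 5.
```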
- Beneath the success statement that plays the answer from the knowledge base, add a speech question to handle a FAQ search failure:
- Name it “FAQ search fail.”
- Enter the statement: “I’m sorry! That’s not something I can help with. Could you try again?”
- Go into the Advanced tab in the interaction settings, and set the Elapsed time before prompt to “7000”.
- Return to the Integration interaction at the top of the dialog. Add a custom rule where the condition is “API Result” matches “Success.” If the condition is true, the flow should go to the next action.
- Add another custom rule named “Failure,” where the condition is “API Result” matches “Failure.” If the condition is true, the flow should go to the “FAQ search fail” interaction.
Step 18: Add a global function
- Click Global Functions, and add the following code above the initConversation method.
function escapeSpecialCharacters(message) {
var escapedMessage = message.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
return escapedMessage;
}
- Click Save.
(Learn about global functions.)
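Outside the platform, you can sanity-check how the Step 17 post-process script and this global function work together by stubbing botContext. The stub below is an assumption for illustration only; the real botContext object is supplied by Conversation Builder at runtime:

```javascript
// Illustrative stub of the platform-provided botContext object; the real
// object comes from Conversation Builder and is not defined like this.
const botContext = {
  vars: { "FAQSearch.article": "Open Mon-Sat. Walk-ins < 30 min wait & no fee." },
  getBotVariable(name) { return this.vars[name]; },
  setBotVariable(name, value, persist, isLongTerm) { this.vars[name] = value; },
  printDebugMessage(msg) { console.log(msg); }
};

// The global function from Step 18. Ampersands are escaped first so the
// "&" produced by &lt; and &gt; is not double-escaped.
function escapeSpecialCharacters(message) {
  var escapedMessage = message.replace(/&/g, "&amp;").replace(/</g, "&lt;").replace(/>/g, "&gt;");
  return escapedMessage;
}

// The post-process script from Step 17.
var articleText = botContext.getBotVariable("FAQSearch.article");
var escapedText = escapeSpecialCharacters(articleText);
botContext.printDebugMessage("Article text = " + escapedText);
botContext.setBotVariable("articleText", escapedText, true, false);
// Debug output: Article text = Open Mon-Sat. Walk-ins &lt; 30 min wait &amp; no fee.
```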
Step 19: Test the automated answers
Now let’s see the answers enriched via Generative AI in action.
- Return to the Dialogs page, and use Preview to ask a few intentful questions, such as:
- What are your pharmacy hours?
- Can I transfer my prescriptions?
- Do you give flu shots?
By default, Voicebots support context switching, so you should be able to ask an FAQ anywhere in the bot flow. The bot will answer the FAQ and then return you to the previous flow.
- Try some small talk too:
- Hi, how are you?
- What’s up?
What's next?
Continue on to the next tutorial in the series.