Introduction

This article provides reference info on the variables used in the various Generative AI solutions within Conversational Cloud.

If you're looking for an introduction to variables and general info on working with variables in the Prompt Library, see this article.

Variables for automated conversation summaries

If you’re working on a prompt to generate conversation summaries via an LLM, the following client type-specific variables are available:

  • {language}: Optional. Include this variable to generate summaries in a specific language. If you omit this variable, the summaries are generated in English.
  • {text}: Required. At runtime, this is resolved to include the conversation's transcript.

Never remove the {text} variable. If you do, conversation transcripts won't reach the LLM, leading to inaccurate summaries. We expose the variable in the UI so you can 1) position it in the prompt wherever you require, and 2) easily add it back if you inadvertently delete it.
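
For illustration, here's a minimal Python sketch of how these two variables might be resolved at runtime. The prompt wording, the sample transcript, and the substitution code are hypothetical; they are not Conversational Cloud's actual implementation.

# Conceptual sketch only: shows how {language} and {text} might be resolved
# into a summary prompt at runtime. Wording and data are hypothetical.
summary_prompt_template = (
    "Summarize the conversation below in {language}. "
    "Focus on the consumer's issue and how it was resolved.\n\n"
    "Conversation transcript:\n{text}"
)

# At runtime, the platform supplies the transcript and, optionally, a language.
resolved_prompt = summary_prompt_template.format(
    language="Spanish",  # omitting {language} from your prompt means summaries default to English
    text="Consumer: My order hasn't arrived.\nAgent: Let me check that for you.",
)
print(resolved_prompt)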

Learn more about {language} and {text}.

Variables for Copilot Rewrite

If you're working on a prompt to generate a rewritten version of the agent's message, the following client type-specific variables are available:

  • {text}: Required. At runtime, this is resolved to include the agent's message that needs to be rewritten.

Never remove the {text} variable. If you do, the agent's message won't reach the LLM, leading to inaccurate responses. We expose the variable in the UI so you can 1) position it in the prompt wherever you require, and 2) easily add it back if you inadvertently delete it.
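
As a rough illustration, the Python sketch below shows how {text} might be substituted into a rewrite prompt at runtime. The prompt wording and the sample message are hypothetical, not a LivePerson template.

# Conceptual sketch only: illustrates how {text} is resolved for Copilot Rewrite.
# The prompt wording and sample message are hypothetical.
rewrite_prompt_template = (
    "Rewrite the agent's message below so it is clear, polite, and professional, "
    "without changing its meaning.\n\n"
    "Agent's message:\n{text}"
)

resolved_prompt = rewrite_prompt_template.format(
    text="we cant refund that item sorry",  # the agent's draft message, supplied at runtime
)
print(resolved_prompt)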

Learn more about Copilot Rewrite configuration.

Variables for Routing AI agents

If you’re working on a prompt for a Routing AI agent, there's just one client type-specific variable available:

  • {intent_list}

{intent_list}

At runtime, this variable is resolved to include the list of routes (their names and descriptions) that are defined in the Guided Routing interaction.

Don't remove the {intent_list} variable. If you do, the Guided Routing interaction won't work. We expose the variable in the UI so you can 1) position it in the prompt wherever you require, and 2) easily add it back if you inadvertently delete it.
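
To make the resolution concrete, here's a hypothetical Python sketch of how a list of routes might be rendered into {intent_list}. The rendering format and the sample routes are assumptions, not Conversational Cloud's actual output.

# Conceptual sketch only: shows how {intent_list} might be resolved from the routes
# defined in a Guided Routing interaction. Format and routes are assumptions.
routes = [
    {"name": "Billing", "description": "Questions about invoices, charges, or refunds"},
    {"name": "Technical support", "description": "Help with setup or troubleshooting"},
]

intent_list = "\n".join(f"- {r['name']}: {r['description']}" for r in routes)

routing_prompt_template = (
    "Classify the consumer's request into one of the following routes:\n{intent_list}"
)
print(routing_prompt_template.format(intent_list=intent_list))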

Variables for solutions enriching KnowledgeAI answers

The variables discussed in this section apply to Conversation Assist and Conversation Builder KnowledgeAI agents. Both make use of prompts that instruct the LLM to enrich the answers returned by KnowledgeAI.

If you’re working on a prompt for a Conversation Assist rule or a KnowledgeAI agent, the following client type-specific variables are available:

  • {brand_name}
  • {brand_industry}
  • {knowledge_articles_matched}

{brand_name} and {brand_industry}

We recommend that you follow the example in our prompt templates and use {brand_name} and {brand_industry} in your prompts. Research by our data scientists has shown that this helps the response stay in bounds, i.e., specific to your brand, with fewer hallucinations.

When you activate our Generative AI features in the Management Console, we ask you to specify your brand name and industry for this reason. At runtime, the values that you specified are substituted for these variables. You can return to the Management Console at any time to change them.
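
As a rough illustration, the Python sketch below shows how a hypothetical enrichment prompt could use {brand_name} and {brand_industry}, with the Management Console values substituted at runtime. The wording and the brand details are invented for the example.

# Conceptual sketch only: a hypothetical answer-enrichment prompt that uses
# {brand_name} and {brand_industry} to keep the response scoped to the brand.
enrichment_prompt_template = (
    "You are a helpful assistant for {brand_name}, a company in the {brand_industry} "
    "industry. Only answer with information relevant to {brand_name} and its offerings."
)

# At runtime, these values come from what you specified in the Management Console.
print(enrichment_prompt_template.format(
    brand_name="Acme Telecom",            # hypothetical brand name
    brand_industry="telecommunications",  # hypothetical industry
))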

{knowledge_articles_matched}

At runtime, this variable is resolved to include the list of articles that matched the consumer's query. Learn about the matched articles.

Research indicates that the position of this variable can strongly influence the generated response, so place it wherever works best for your use case. Many brands put it at the end of the prompt.
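
For illustration, here's a hypothetical Python sketch of how matched articles might be rendered into {knowledge_articles_matched} at the end of a prompt. The rendering format and the sample articles are assumptions, not KnowledgeAI's actual output.

# Conceptual sketch only: shows how {knowledge_articles_matched} might be resolved
# and placed at the end of the prompt. Format and articles are assumptions.
matched_articles = [
    {"title": "Reset your password", "content": "Go to Settings > Security, then..."},
    {"title": "Supported browsers", "content": "We support the latest versions of..."},
]

articles_block = "\n\n".join(
    f"Article: {a['title']}\n{a['content']}" for a in matched_articles
)

answer_prompt_template = (
    "Answer the consumer's question using only the articles below.\n\n"
    "{knowledge_articles_matched}"
)
print(answer_prompt_template.format(knowledge_articles_matched=articles_block))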

Don't remove the {knowledge_articles_matched} variable. If you do, KnowledgeAI's answer enrichment service won't work. We expose the variable in the UI so you can 1) position it in the prompt wherever you require, and 2) easily add it back if you inadvertently delete it.

If you're accessing a prompt via Conversation Assist, you won't see the {knowledge_articles_matched} variable exposed in the UI yet, but rest assured that it's in your prompt. You can manually change its location within the prompt's instructions as you require.