Introduction
Every successful Generative AI solution relies on the power of prompts to bridge the gap between human intention and machine creativity. Prompts guide the behavior of Large Language Models (LLMs); they are the language through which humans instruct the AI system to generate responses.
Conversational Cloud’s Prompt Library is the interface you use to select, create, and manage the prompts in your conversational AI solution. It’s designed to accelerate and simplify these processes, so you can unlock the full potential of Generative AI more easily.
Not familiar with LivePerson’s trustworthy Generative AI solution? Get acquainted and get started in our Knowledge Center.
Key benefits
- Customization that aligns with your brand’s identity: Tailor prompts to reflect your brand’s unique voice and tone.
- Efficiency and cost savings: Quickly deploy Generative AI conversational assistance and LLM-powered bots using pre-built prompts, validated by LivePerson’s data science experts, slashing time to value and operational costs.
- An enhanced consumer experience: Use well-crafted, customized prompts for smoother and personalized responses, driving customer satisfaction and greater containment.
- Self-service changes: You're in control. Make prompt changes whenever you need to.
Exposure points
Currently, the Prompt Library is available for use in some but not all areas of Conversational Cloud that take advantage of Generative AI.
You can use the Prompt Library to create and manage prompts for Generative AI solutions using:
- Conversation Assist to offer enriched answers to agents
- Conversation Builder to automate enriched answers to consumers
At this time, you can’t use the Prompt Library for solutions using:
- Intent Manager to generate training phrases for an intent
- Automated conversation summaries to generate summaries of ongoing and historical conversations
Language support
The Large Language Model (LLM) used by LivePerson's Generative AI is multilingual, meaning it can accept prompts and respond in multiple languages. That said, it's most efficient and generally most capable with English data due to how it was trained. Prompts written in English can, and often do, work well in non-English contexts.
Prompts in English and Spanish are officially supported. We've tested both with good results (see our best practices).
Feel free to experiment with other languages, but be sure to test whether they yield high-quality responses.