LivePerson Voice & Video is an SDK (Software Development Kit) for the Apple iOS platform. To integrate with us, you need an app of your own to which you have full source-code access. Basic programming knowledge is required.


The SDK requires our LivePerson Messaging SDK integrated into your app. Your consumers will always engage in a messaging conversation first, before your agents choose to escalate the conversation to a Voice, Video or CoBrowse session.

Supported iOS Versions

iOS Version | Support | Limitations
8.x or less | not supported | -
9.0+ | supported, with limitations | limited call notifications
10.0+ | supported | -

Supported Devices

  • All iPhone models starting from iPhone 5s
  • iPhone 4s/5 are not officially supported, but may work
  • iPads are currently not officially supported, but may work
  • iPods are not supported

Supported Programming Languages

Only native applications written in either or both of the following programming languages are supported:

  • Swift
  • Objective-C

Cross-platform apps using native wrappers (e.g. Cordova) can be made to integrate with Voice & Video support with some additional setup effort. Remote-app control, however, is only possible on native UI components (like those generated by React Native or Titanium). Neither is currently officially supported.

Other Features

Feature | Support | Comment
Bitcode | (currently) not supported | Set Enable Bitcode in your Build Settings to NO

Installing the SDK Manually

To add the SDK manually follow these steps for your app's Xcode Project:

Step 1: Copy Dependencies

  • Copy the LPCoAppSDK.framework to your project's file directory.
  • Next, add the LPCoAppSDK.bundle contained within LPCoAppSDK.framework to your target's Build Phases → Copy Bundle Resources step.

Step 2: Link Frameworks

Add the following frameworks to your build target's Build Phases → Link Binary With Libraries option.

  • LPCoAppSDK.framework
  • GLKit.framework
  • VideoToolbox.framework


Step 3: Adjust Build Settings

Under your target's Build Settings, adjust the following:

  • Add the directory containing LPCoAppSDK.framework to Framework Search Path
  • Add the following line to Library Search Paths:
    • $(FRAMEWORK_SEARCH_PATHS)/LPCoAppSDK.framework/**
  • Add the following flags to your Other Linker Flags setting:
    • -lc++
    • -lWebRTC

Important: Ensure that your iOS Base SDK is set to 9.0 or higher.
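The build settings above can also be expressed in an .xcconfig file. A sketch, assuming LPCoAppSDK.framework lives in a Frameworks/ folder inside your project directory; adjust paths to your layout:

```
// Sketch of equivalent .xcconfig entries (assumed paths, not prescriptive)
FRAMEWORK_SEARCH_PATHS = $(inherited) $(PROJECT_DIR)/Frameworks
LIBRARY_SEARCH_PATHS = $(inherited) $(FRAMEWORK_SEARCH_PATHS)/LPCoAppSDK.framework/**
OTHER_LDFLAGS = $(inherited) -lc++ -lWebRTC
IPHONEOS_DEPLOYMENT_TARGET = 9.0
```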

Now continue with the Project Settings.

Project Settings

These settings must be adjusted in your Xcode project.

  1. Bitcode Compilation
  2. VoIP Background Mode
  3. Privacy Info (iOS 10)
  4. Ring Sound

Bitcode Compilation

Currently the SDK does not support Bitcode Compilation. In your Build Settings choose:

  • Enable Bitcode: NO

VoIP Support

In the app Capabilities section, enable Background Mode, and check the following options:

  • Audio, AirPlay and Picture in Picture
  • Voice over IP
  • Remote notifications


Your app will now be able to receive background calls and continue voice conversations while the app is running in the background.
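Checking these boxes in the Capabilities tab simply writes the corresponding keys into your app's Info.plist. If you prefer to edit the plist directly, the result should look roughly like this (a sketch; key order does not matter):

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
    <string>voip</string>
    <string>remote-notification</string>
</array>
```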

Privacy Info (iOS 10)

Starting from iOS 10, you are required to provide a description of why you need access to the Microphone and Camera. This is essential, as otherwise your app may crash.

Please add the following keys to your app's Info.plist:

Key | Type | Value
NSMicrophoneUsageDescription (Privacy - Microphone Usage Description) | String | Needed for voice calls
NSCameraUsageDescription (Privacy - Camera Usage Description) | String | Needed for video calls
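With these usage descriptions in place, iOS shows your text the first time the microphone or camera is accessed. If you would rather trigger the prompts at a moment of your choosing instead of mid-call, a minimal sketch using AVFoundation (API names as in recent SDKs):

```swift
import AVFoundation

// Pre-request microphone and camera access so the system prompts
// (showing your usage descriptions) appear before the first call.
AVCaptureDevice.requestAccess(for: .audio) { granted in
    print("Microphone access granted: \(granted)")
}
AVCaptureDevice.requestAccess(for: .video) { granted in
    print("Camera access granted: \(granted)")
}
```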


If you wish to localize the user message, create a localized file called InfoPlist.strings and add translations like this:

NSMicrophoneUsageDescription = "Allow for voice calls";
NSCameraUsageDescription = "Allow for video calls";

Also set the key Localized resources can be mixed in your Info.plist to YES.
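For example, a hypothetical German localization (de.lproj/InfoPlist.strings, illustrative translations only) might look like:

```
/* de.lproj/InfoPlist.strings — illustrative example */
NSMicrophoneUsageDescription = "Für Sprachanrufe benötigt";
NSCameraUsageDescription = "Für Videoanrufe benötigt";
```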

For complete documentation, see: Apple Tech FAQ & Info.plist HowTo

Ring Sound

To add a ring sound for push calls, do either of the following:

  • Create a new sound file named ring.caf and add it to your project
  • Copy the ring.caf file from our Sample App project into your app's project

Either way, make sure it's included in the bundle resources of your app's target. Hint: .caf files are audio files optimized for iOS. You can convert any .aiff file using the command below:

afconvert -v -f 'caff' -d aac -s 1 -b 192000 MySource.aif MyOutput.caf
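A quick runtime sanity check (a sketch, using only Foundation) confirms the file actually made it into the bundle; if it is missing, push calls will ring silently:

```swift
import Foundation

// Sanity check: ring.caf must be present in the app bundle.
if let ringURL = Bundle.main.url(forResource: "ring", withExtension: "caf") {
    print("Ring sound found at \(ringURL.path)")
} else {
    assertionFailure("ring.caf is missing from Copy Bundle Resources")
}
```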

Integration into App

Before you continue, make sure you have completed the SDK installation AND adjusted your Project Settings.

Step 1: Header Includes

If you are using a Swift project, add this to your app's Bridging-Header:

#import <LPCoAppSDK/LPCoApp.h>

Note: If you do not have a bridging header yet, simply create a new Objective-C file in your project. Xcode will ask to create a bridging header for you. For Objective-C projects, you can directly import the header in your .m file.

Step 2: Code Calls

Add the following to your AppDelegate's application(_:didFinishLaunchingWithOptions:) function:

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplicationLaunchOptionsKey: Any]?) -> Bool {
    // ... do your regular app setup here

    if #available(iOS 9, *) { // set up & register for push calls
        LPCoApp.shared().register { token in
            LPMessagingSDK.instance.registerVoipPushNotifications(token: token!)
        }
    }
    return true
}

This is all that is needed to use the SDK.

Optional: LP Messaging SDK - Integration

To allow users to return to the messaging conversation from within a Voice/Video session, complete the following steps:

  • Adopt the LPCoAppDelegate protocol in one of your app's active view controllers
  • Set that view controller as the delegate
  • Implement the callback function:
class YourViewController: UIViewController, LPCoAppDelegate {
    override func viewDidLoad() {
        super.viewDidLoad()
        LPCoApp.shared().delegate = self
    }

    func LPCoAppShowMessagingConversation() {
        // push conversation view controller (see Messaging SDK documentation)
    }
}

Now, whenever users tap on the messaging icon, your callback will be executed, returning them to their original conversation.