How to embed a voice assistant into an iOS app (Swift and Objective-C)

Alan AI
7 min read · Nov 14, 2020

You can create a voice assistant or a chatbot for an iOS app using the Alan iOS voice SDK. In this tutorial, we’ll create a single view iOS app with Alan voice and test drive it in a simulator. The app users will be able to use a wake word or tap the Alan button to interact with the voice assistant.

From this tutorial, you’ll learn:

  • How to add a voice interface to an iOS app
  • How to write simple voice commands for an iOS app

To get started, you’ll need the following:

  • An Alan Studio account with a project created in it (the project provides the Alan SDK key used later in this tutorial)
  • Xcode installed on your computer

Step 1: Create an iOS app with a single view

For this tutorial, we’ll be using a simple iOS app with a single view. Follow the instructions below to create the app:

  1. Open Xcode and create a new Xcode project.
  2. Select Single View App.
  3. In the Product Name field, type in the project name.
  4. From the Language list, select Swift.
  5. From the User Interface list, select Storyboard.
  6. Select a folder in which the project will reside and click Create.

Step 2: Add the Alan iOS SDK to the project

There are two ways to add the Alan iOS SDK to the app project: with CocoaPods or manually. In this tutorial, we’ll add the SDK manually.
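If you prefer CocoaPods, a minimal Podfile could look like the sketch below. The pod name 'AlanSDK', the 'MyVoiceApp' target, and the platform version are assumptions here; check the Alan iOS SDK documentation for the exact pod name and substitute your own target:

# Podfile (sketch; pod name and platform version are assumptions)
platform :ios, '11.0'
use_frameworks!

target 'MyVoiceApp' do
  # Replace MyVoiceApp with your app target name
  pod 'AlanSDK'
end

After running pod install, open the generated .xcworkspace instead of the .xcodeproj.

To add the SDK manually, follow these steps: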

  1. Open Alan GitHub and go to the Releases page for the Alan iOS SDK: https://github.com/alan-ai/alan-sdk-ios/releases.
  2. From the latest release, download the AlanSDK.framework_<x.x.x>.zip file.
  3. Extract AlanSDK.framework from the ZIP archive on your computer.
  4. Drag AlanSDK.framework and drop it under the root node of the Xcode project.
  5. In the displayed window, select the Copy items if needed check box and click Finish.

The AlanSDK.framework node will appear in the project tree.

Step 3: Specify the Xcode project settings

Project settings need to be adjusted so that we can use the Alan iOS SDK.

  1. When the app is built, we need to make sure the Alan iOS SDK is embedded. On the General tab, scroll down to the Frameworks, Libraries, and Embedded Content section. To the right of AlanSDK.framework, select Embed & Sign.

2. In iOS, the user must explicitly give an app permission to access the microphone. We need to add a special key to the Xcode project with a description telling the user why the app requires microphone access:

a. Go to the Info tab.

b. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to the right.

c. Select Privacy — Microphone Usage Description from the list.

d. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app asks for microphone access.

3. We need to allow the background mode for our app. Go to the Signing & Capabilities tab. In the top left corner, click + Capability and, in the capabilities list, double-click Background Modes. In the Modes list, select the check box in front of Audio, AirPlay, and Picture in Picture. (Steps 2 and 3 map to raw Info.plist entries; see the sketch at the end of this step.)

4. Ensure that the background mode is enabled in our Alan Studio project. In Alan Studio, at the top of the code editor, click Integrations, open the iOS tab and enable the Keep active while the app is in the background option.

5. In Xcode, go to the Build Phases tab. In the top left corner, click the + button and select New Run Script Phase. Add the following in the added Run Script section:

sh "${BUILT_PRODUCTS_DIR}/${FRAMEWORKS_FOLDER_PATH}/AlanSDK.framework/frameworks-strip.sh"

This script removes unnecessary architecture slices, which reduces the final package size.
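For reference, steps 2 and 3 above correspond to the following raw entries in the app’s Info.plist (a sketch; the description string is whatever you entered in the Value field):

<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone for voice commands.</string>
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>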

Step 4: Integrate Alan with Swift

The next step is to update our app to import the Alan iOS SDK and add the Alan button to it.

  1. In the app folder, open the ViewController.swift file.
  2. At the top of the file, import the Alan iOS SDK:
import AlanSDK

3. In the ViewController class, define variables for the Alan button and Alan text panel:

class ViewController: UIViewController {
    ...
    /// Alan button
    fileprivate var button: AlanButton!
    /// Alan text panel
    fileprivate var text: AlanText!
    ...
}

4. To the ViewController class, add the setupAlan() function. This function sets up the Alan button and the Alan text panel and positions them on the view:

class ViewController: UIViewController {
    ...
    fileprivate func setupAlan() {
        /// Define the project key (set in the next step)
        let config = AlanConfig(key: "")

        /// Init the Alan button
        self.button = AlanButton(config: config)

        /// Init the Alan text panel
        self.text = AlanText(frame: CGRect.zero)

        /// Add the button and text panel to the view
        self.view.addSubview(self.button)
        self.button.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(self.text)
        self.text.translatesAutoresizingMaskIntoConstraints = false

        /// Align the button and text panel on the view
        let views = ["button": self.button!, "text": self.text!]
        let verticalButton = NSLayoutConstraint.constraints(withVisualFormat: "V:|-(>=0@299)-[button(64)]-40-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        let verticalText = NSLayoutConstraint.constraints(withVisualFormat: "V:|-(>=0@299)-[text(64)]-40-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        let horizontalButton = NSLayoutConstraint.constraints(withVisualFormat: "H:|-(>=0@299)-[button(64)]-20-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        let horizontalText = NSLayoutConstraint.constraints(withVisualFormat: "H:|-20-[text]-20-|", options: NSLayoutConstraint.FormatOptions(), metrics: nil, views: views)
        self.view.addConstraints(verticalButton + verticalText + horizontalButton + horizontalText)
    }
    ...
}

5. Now, in let config = AlanConfig(key: ""), define the Alan SDK key for your Alan Studio project. To find the key, go to Alan Studio and, at the top of the code editor, click Integrations and copy the value from the Alan SDK Key field. Then paste the key into the Xcode project.

6. In viewDidLoad(), call the setupAlan() function:

class ViewController: UIViewController {
    ...
    override func viewDidLoad() {
        ...
        self.setupAlan()
    }
    ...
}


Now, run the app in the simulator. At the first launch, the app displays an alert asking for microphone access, with the description we added on the Info tab.

Step 5: Add voice commands to interact with Alan

Open the project in Alan Studio and, in the code editor, add the following intents:

intent(`What is your name?`, p => {
    p.play(`It's Alan, and yours?`);
});

intent(`How are you doing?`, p => {
    p.play(`Good, thank you. What about you?`);
});

Now, tap the Alan button and ask: What is your name? and How are you doing? Alan will respond with the answers we have provided in the intents.
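A single intent can also match several phrasings at once. Here is a sketch using the alternatives syntax from the Alan Studio pattern language (the phrases themselves are just examples):

intent(`(Hello|Hi|Hey) Alan`, p => {
    p.play(`Hello! How can I help you?`);
});

With this pattern, saying any of Hello Alan, Hi Alan, or Hey Alan triggers the same response.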

What you get after going through this tutorial

After going through this tutorial, you will have a single view iOS app integrated with a voice assistant. To make sure you have set up your app correctly, you can get an example of such an app from the Alan GitHub.

Here’s what you can do next

Now, you can proceed with building out the voice interface for your app with Alan.


Alan AI

Alan is a B2B Voice AI platform for developers to deploy and manage Voice Interfaces for Enterprise Apps.