Add AI Voice Assistant to Android Apps

Alan AI
11 min read · Feb 4, 2022

Make your app more user-friendly and accessible by adding Voice UX. In this article, you’ll learn how to add voice commands and implement AI assistant capabilities in an Android application.

What is Voice Experience?

Whether you have been developing applications for a long time or have just started exploring them, you have probably come across the terms User Interface and User Experience, popularly known as UI/UX. User Interface deals with the visual elements such as colors, layouts, fonts, and more through which the user interacts with the application. But have you ever imagined a hands-free application that you could interact with by voice? It sounds amazing!

It not only sounds cool; in real-life scenarios it might be the solution that many folks need. But how do we add voice to our apps? Building a voice experience can involve a range of skills: machine learning, speech recognition software, natural language processing, and more.

With Alan, anyone can integrate a complete voice interface into their applications without all the complexity. The Alan Platform automates all of the above processes with its cloud-based infrastructure and lets developers focus more on implementing what they want, to build an amazing voice experience.

How do you add a Voice Experience to your Android app?

While Alan supports numerous frameworks on web and mobile platforms, let’s learn how to add Alan to Android applications using the Alan Android SDK. In this tutorial, we’ll build a Todo app for Android smartphones, add a Voice Experience using Alan, and test it on a device. We’ll also learn how to use some advanced Alan functionalities, such as server functions, which provide an extra security and abstraction layer for our app.

What will you learn?

  • How to get started with Alan Platform
  • How to add a voice assistant to our Android Todo app
  • Advanced Alan functionalities

What are the prerequisites?

  • Stable internet connection
  • Android Studio
  • Knowledge of Java, the Android framework, and basic JavaScript

So let’s get started without further delay.

Step 1: Create a new Android Studio Project

  • Open Android Studio and create a new project. Select Basic Activity as it comes with a Floating Action Button. You can also select a different activity if you want a different layout for your home screen.
Create a new project with Basic Activity
  • Name your project, select a location to save it, and select the minimum SDK for your Android app (API 26 or above is recommended). Click Finish.
Naming the Project
  • A code template for Basic Activity is provided by default. Delete or modify the unnecessary files in the project. To keep the application simple, run the entire app in a single activity, with multiple fragments as required.

A sample code template with the essential modifications is provided here. Refer to the README file for a clear understanding of the code and how to clone the repository.

In this project we will use view binding, which replaces the traditional method of creating objects for layout components with findViewById() and provides a compact way to access the components directly in our code.
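View binding has to be enabled before the generated binding classes become available. A minimal sketch of the change to the module-level build.gradle (Groovy syntax; assumes a recent Android Gradle Plugin), inside the android block:

```groovy
android {
    ...
    buildFeatures {
        // Generate a binding class for every XML layout in this module
        viewBinding true
    }
}
```

After a Gradle sync, a layout named activity_main.xml yields an ActivityMainBinding class; calling its inflate() method gives an object that exposes every view with an ID as a field.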

After making essential changes, your MainActivity.java will look like this:

MainActivity.java

Step 2: Sign up on Alan Studio

  • Open your browser and head over to Alan Studio and sign up. If you already have an account, sign in instead.
  • Create an Empty project and give your Todo project a name.
    Note: The Android Studio and Alan Studio project names need not be the same.
Create an empty project
  • Alan comes with a lot of predefined scripts that help you add some of the most popular functionalities with a few clicks. Click Add Script and add as many scripts as you want.
Add predefined scripts

Step 3: Add Alan dependency to your Todo app

  • Click the Integrations button in Alan Studio for instructions on adding Alan to your project. Under the Android tab, a list of customization options is provided, along with a sample code template.
  • Copy the implementation line and paste it into the module-level build.gradle file in Android Studio, then click Sync now.
dependencies {
    ...
    // Add Alan dependency
    implementation 'app.alan:sdk:4.12.0'
}
Module level build.gradle
  • Add the Alan button to activity_main.xml as shown below. In this code snippet a CoordinatorLayout is used; a few attributes may differ depending on the parent layout and the other components in your layout.
<com.alan.alansdk.button.AlanButton
    android:id="@+id/alan_button"
    android:layout_width="wrap_content"
    android:layout_height="wrap_content"
    android:layout_gravity="bottom|end"
    android:layout_marginEnd="5dp"
    android:layout_marginBottom="70dp"
    android:visibility="visible"
    app:button_horizontal_align="right" />
activity_main.xml
  • Now let’s add the logic to activate the Alan button in our app. Go to MainActivity.java and add the following code to connect to your Alan project. Find your SDK key on the Alan Studio Integrations page and paste it in place of “YOUR SDK KEY”.
public class MainActivity extends AppCompatActivity {
    ...
    private ActivityMainBinding binding;
    private static final String SDK_KEY = "YOUR SDK KEY";

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        ...
        AlanButton alanButton = binding.alanButton;
        AlanConfig config = AlanConfig.builder()
                .setProjectId(SDK_KEY)
                .build();
        alanButton.initWithConfig(config);
    }
}

Alright, let’s test what we’ve built so far. Build the project and launch the application on an emulator or your smartphone. Tap the Alan button and say “Hello world”.

Hello world

Step 4: Add custom voice commands

Your application can now respond to various questions by voice, depending on the scripts you added earlier in Alan Studio. But that’s not all: what if you want Alan to respond to a particular question in a particular way?

Let’s add some custom voice commands. Head over to Alan Studio and open the default script AlanTodo. Add your question and a response for it.

intent('What can you do?', p => {
    p.play('I can add a task and read them for you');
});

intent('Say hurrah', p => {
    p.play('Hurrah!');
});

Click Save Changes. Now tap the Alan button again and ask, “What can you do?”

Custom voice command

Add as many commands as you want and test them out. If you’re a beginner having a hard time understanding Alan scripts, head over to our free Udemy course, where you can learn all the fundamental concepts within an hour and get certified.

Step 5: Add core logic to your app

Now it’s time to show your creativity and build the Todo app as you desire. Add item layouts, home and login screens, connect to various backend services, add authentication for user accounts, and more.

“Keep as much code as possible in dedicated methods so that it is easier to add voice in later steps.”

Note that the Alan button is bound to an Activity. So, every time you create a new Activity where you want the Alan button to work, you have to add it to the layout and initialize it again, and you need to disable the Alan button in the previous activity before starting a new one. Therefore, it is recommended to work in a single activity throughout and use fragments instead.

You can refer to our award-winning project Todogenix for layouts and other resources. You can also find code implementing interfaces that enable communication between Java files. Learn more about Todogenix here.

Step 6: Add Voice navigation to your app

Great! Your Todo app now has a great UI and works well. Let’s now add some voice commands that can navigate through different screens and perform actions such as clicking buttons, highlighting elements, and more.

Communication between our Android app and Alan Studio happens through API calls in which JSON objects are exchanged. When a user asks Alan a question, the query is sent to Alan Studio and a JSON object is returned containing all the data associated with the query.
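To make the exchange concrete, here is roughly what the JSON object for the add-task command defined later in this tutorial looks like when it reaches the client (the field values are illustrative, and the envelope may carry additional metadata):

```json
{
  "data": {
    "command": "add_todo",
    "title": "buy groceries"
  }
}
```

The inner object under "data" is exactly the object passed to p.play() in the Alan script.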

Add Navigation commands in Alan Studio

Adding navigation commands to Alan is very similar to adding normal voice commands. We just need to add an extra line containing the JSON object that is sent to our app.

// In Alan console
intent('(Go|take me) back', p => {
    p.play('Sure');
    p.play({ command: "go_back" });
});

intent('Add a todo task (named|with title|) $(TASK_TITLE* (.+))', p => {
    p.play('Adding the task');
    p.play({ command: "add_todo", title: p.TASK_TITLE.value });
});

This is how the console looks after adding the above code:

Alan Studio console
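The `$(TASK_TITLE* (.+))` part of the second pattern is a slot: it captures the free-form tail of the utterance into `p.TASK_TITLE`. The matching itself happens server-side in Alan, but as a rough analogy it behaves like a regex capture group. Here is a hypothetical Java illustration (the class and method names are ours, not part of the Alan SDK):

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SlotDemo {
    // Roughly mirrors 'Add a todo task (named|with title|) $(TASK_TITLE* (.+))'
    static final Pattern ADD_TODO =
            Pattern.compile("Add a todo task (?:named |with title )?(.+)");

    // Returns the captured task title, or null if the utterance doesn't match
    public static String extractTitle(String utterance) {
        Matcher m = ADD_TODO.matcher(utterance);
        return m.matches() ? m.group(1) : null;
    }
}
```

So “Add a todo task named buy milk” would yield the slot value “buy milk”, which the script then forwards to the client as the `title` field.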

Catch the command on your client

When a specific command in Alan is triggered, it sends a JSON object back to our client. Here’s how to receive and decode the JSON object in Android Studio. After initializing the Alan button, add these lines:

AlanCallback alanCallback = new AlanCallback() {
    // Handle commands from Alan Studio
    @Override
    public void onCommand(final EventCommand eventCommand) {
        try {
            JSONObject command = eventCommand.getData();
            // The data object is the JSON object passed to the p.play() method
            JSONObject data = command.getJSONObject("data");
            String commandName = data.getString("command");
            // Based on commandName we can perform different tasks
            executeCommand(commandName, data);
        } catch (JSONException e) {
            Log.e("AlanButton", e.getMessage());
        }
    }
};
alanButton.registerCallback(alanCallback);

The Alan callback object keeps listening for calls from Alan Studio with the help of handlers. Whenever a command is triggered in Alan Studio, data is sent to our app and the onCommand method is executed, where the data is decoded. We can create a function named executeCommand that contains the logic for each type of data we receive.

private void executeCommand(String commandName, JSONObject data) {
    if (commandName.equals("go_back")) {
        onBackPressed(); // Android lifecycle method
    }
    if (commandName.equals("log_out")) {
        logOut();
    }
    if (commandName.equals("add_todo")) {
        try {
            String title = data.getString("title");
            addTodoItem(title);
        } catch (JSONException e) {
            Log.e("AlanButton", e.getMessage());
            binding.alanButton.playText("I'm sorry, I'm unable to do this at the moment");
        }
    }
}

private void logOut() {
    // Code to log out
}

private void addTodoItem(String title) {
    // Code to add a todo item
}
MainActivity.java

This is how we receive data from Alan and perform the corresponding task in our application. Create as many intents in the Alan script, and corresponding functions in Android Studio, as you need to achieve a hands-free application.
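The if-chain in executeCommand grows with every new intent; a map from command names to handlers keeps the dispatch flat as the app gains commands. Below is a framework-free sketch of that pattern, with a plain string standing in for the JSON payload (the class and method names are illustrative, not part of the Alan SDK):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Consumer;

public class CommandDispatcher {
    private final Map<String, Consumer<String>> handlers = new HashMap<>();

    // Register a handler for a voice command name
    public void register(String commandName, Consumer<String> handler) {
        handlers.put(commandName, handler);
    }

    // Look up and run the handler; returns false for unknown commands
    public boolean dispatch(String commandName, String payload) {
        Consumer<String> handler = handlers.get(commandName);
        if (handler == null) {
            return false;
        }
        handler.accept(payload);
        return true;
    }
}
```

In the Activity you would register each handler once, e.g. `dispatcher.register("add_todo", this::addTodoItem)`, then call `dispatcher.dispatch(commandName, title)` from onCommand; in a real app the payload type would be the JSONObject rather than a String.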

What else can you do with Alan?

Client API Methods

There are many more concepts in Alan AI beyond making intents and handling them in our app; they are discussed in our Udemy course and documentation. On the client side, we can use Client API methods to enable communication between our Android app and Alan and to trigger voice activities on the client side.

  • One of the most useful methods is callProjectApi(). It can be used to store server functions in Alan Studio and execute them from the client side with an API call. This provides an extra layer of abstraction and security for our code. Let’s look at how to implement it.

Create a function named greetUser in the Alan script that accepts the user’s name as a parameter and greets the user by voice.

//Alan script
projectAPI.greetUser = function(p, param, callback) {
p.userData.nameOfUser = param.userName;
p.play(`Hey ${p.userData.nameOfUser}, this is Alan. The voice assistant for Todo app. Here, you can store your daily tasks and prioritize them`);
};

How do we trigger this method on the client? First, activate the Alan button using another Client API method, activate(). Then pass all the required arguments as a JSON object, as shown below:

public void greetUser(String name) {
    binding.alanButton.activate();
    // Parameters to be passed
    JSONObject object = new JSONObject();
    try {
        object.put("userName", name);
        binding.alanButton.callProjectApi("greetUser", object.toString());
    } catch (JSONException e) {
        e.printStackTrace();
    }
}
  • The playText() method can be used to play text at any point of execution on the client side using the Alan button object.
private void speakText(String text, AlanButton alanButton) {
    // If this function is written outside MainActivity.java,
    // pass alanButton as a parameter as shown and write
    // alanButton.playText(text); instead
    binding.alanButton.playText(text);
}

Explore more about client API methods here.

Alan Handlers

Handlers can be used to build a voice user experience. There are three handlers available in Alan.

  • onCommand handler: We already used this handler in the steps above to listen for different intents and get data from Alan. It handles commands sent from the voice scripts to the app.
  • onButtonState handler: This handler is used to get the different states of the Alan button in your app. It can be helpful when you want to execute a set of tasks based on the button state, for example, greeting the user when the button is connected.
  • onEvent handler: As we discussed earlier, all communication between the client and Alan happens through JSON objects. While Alan interacts with the user and interprets the user’s input, it emits a series of events. You can intercept the event information and use it in your app logic if needed.
Handler Methods

Explore more about Alan Handlers here.

Test the Todo app

Now that we have built our application, it’s time to test it on a device and find bugs. The end product might vary from user to user depending on the elements added in Step 5 and how the functions are executed.

However, we provide a sample project on GitHub for a clear understanding; it is built on the template mentioned earlier. You are free to download the code, modify it for your purposes, and build your own version. Follow the steps in the README file to clone and run the project on your device.

Conclusion

Woah! Woah! That’s a lot of stuff to grasp at once. Let’s have a quick recap of what we learned so far.

We learned how voice can elevate the user experience and take our app to the next level, and how Alan helps us add voice without any background in artificial intelligence. Then we added Alan to our Android application step by step. Along the way we picked up some valuable tips that make our application efficient and our code easy to debug, and saw how to enhance the voice experience with some advanced Alan concepts.

The entire application code is available on GitHub. Feel free to consult the Alan and Android documentation in case of any queries. Create an issue on GitHub if you spot a bug or want to improve the existing code. For further assistance, code reviews, or to showcase your project, join our Slack community.

Link to the getting started Alan AI To-Do template:
https://github.com/venusaim23/Alan-Todo-Template

Happy Hacking :)

Author: Venu Sai

Alan AI

Alan is a B2B Voice AI platform for developers to deploy and manage Voice Interfaces for Enterprise Apps.