Voice User Experience Patterns for existing Application UI: Alan Blog
Alan’s Voice AI Platform allows anyone to enhance their existing application UI with a complete Voice User Experience. Industries including Energy, Manufacturing, Healthcare, Aerospace, and many more are using Alan to add hands-free voice experiences to their applications, and we expect many more to follow. We put together our findings from numerous use cases and want to share the user flow patterns most common for Voice Experiences in existing Enterprise applications.
👀 What are Patterns?
In this case, Patterns refer to items in the app and the actions a user performs on them.
The Alan team works closely with teams from various enterprises to create voice scripts, integrate them into their apps, and design the main conversational voice flows. Across projects, we noticed similar ways of interacting with the apps and similar conversational voice flows. Moreover, UI/UX designers and developers followed best practices in using and implementing patterns, although their definition of a pattern is distinct from ours and even from one another's.
These patterns appear because Apps have similar flows like working with list items, creating, editing, getting notifications; and because a developer’s tools for creating user interfaces are limited.
This is how we define our “Pattern”:
✅ A Pattern is:
❌ A Pattern is not:
🧐 Voice User Experience Patterns for App UI
People generally expect the following standard Patterns to work the same across apps. Below we’ve laid out some examples of patterns and descriptions of the Voice UX. With Alan in the existing application UI, users can touch and type in addition to using the Voice Experience.
List
A series of items: words, numerals, cards, or links, written together in a meaningful grouping or sequence.
User: What documents do I have for today?
Alan: You have Technical documentation, Project Share and Report a power outage. Do you want to go over the first one?
[Alan reads a list of items, scrolls, highlights item, filters them by request.]
Also, all with voice:
- Alan selects and navigates items
- Performs group actions
- Expands nested lists and individual items
- Picks several items from different sets
- Reports back the status of several items
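As a toy sketch of the List pattern (this is not Alan's actual scripting API; `speakList` and `filterItems` are invented names), the read-and-filter behavior might look like:

```typescript
// Hypothetical sketch: turning a list of on-screen items into a spoken
// response, the way a voice script might read, filter, and select them.

interface ListItem {
  title: string;
  tags: string[];
}

// Read the list back as one sentence, then offer the first item.
function speakList(items: ListItem[]): string {
  if (items.length === 0) return "You have no documents for today.";
  const titles = items.map(i => i.title);
  const joined = titles.length > 1
    ? titles.slice(0, -1).join(", ") + " and " + titles[titles.length - 1]
    : titles[0];
  return `You have ${joined}. Do you want to go over the first one?`;
}

// Narrow the visible list by a spoken filter ("show only reports", etc.).
function filterItems(items: ListItem[], tag: string): ListItem[] {
  return items.filter(i => i.tags.includes(tag));
}
```

The same two operations (speak the set, then narrow it) cover most of the list interactions above.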
Content
A page (screen), usually with static content: text, images, videos, or audio. It may be an element of a list or the main page. It includes all elements on the screen, but typically we mean the body or main part of the screen.
User: When will my next refill be delivered?
Alan: Lisinopril will be delivered on Wednesday, January 15th at 2PM to your address. For free.
[Alan reads answers from the screen]
Also, all with voice:
- May search within a screen and open attachments
- Provides additional options
- Offers to set up extra features that may be related but are not currently on screen
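One way to picture the Content pattern is answering a question directly from what is already rendered on screen. This is a hypothetical sketch (the `RefillScreen` shape and function name are invented, not a real SDK):

```typescript
// Hypothetical sketch: the screen's state already holds the answer, so the
// voice layer only needs to phrase it.

interface RefillScreen {
  medication: string;
  deliveryDate: string;
  address: string;
}

// "When will my next refill be delivered?" answered from screen state.
function answerRefillQuestion(screen: RefillScreen): string {
  return `${screen.medication} will be delivered on ${screen.deliveryDate} to your address.`;
}
```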
Task
This is a combination of content, list, and form. It may include metadata and options for editing, creating, and deleting. The task is also stored among others within a higher-level structure of different items (imagine a Jira task in a huge Jira world).
User: What's the status of Inline inspection?
Alan: In progress
User: Change status to Resolved
Alan: Inline inspection status was changed to Resolved
- Can be edited
- Can be changed when opened (task opened in full screen) or closed (not on screen)
- Can be moved between others, or put into a different structure tree within the whole system
- Has metadata like tags and statuses
Examples: work orders, delivery orders, support requests.
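The status-change exchange above can be sketched as two small handlers. This is illustrative only; the `Task` type and function names are made up, not part of any real SDK:

```typescript
// Hypothetical sketch of the Task pattern: looking a task up by name and
// changing its status with a single utterance.

type Status = "Open" | "In progress" | "Resolved";

interface Task {
  name: string;
  status: Status;
  tags: string[];
}

// Answer "What's the status of X?" from the task store.
function statusOf(tasks: Task[], name: string): string {
  const task = tasks.find(t => t.name.toLowerCase() === name.toLowerCase());
  return task ? task.status : `I couldn't find a task called ${name}.`;
}

// Apply "Change status to Y" and confirm back to the user.
function changeStatus(tasks: Task[], name: string, status: Status): string {
  const task = tasks.find(t => t.name.toLowerCase() === name.toLowerCase());
  if (!task) return `I couldn't find a task called ${name}.`;
  task.status = status;
  return `${task.name} status was changed to ${status}`;
}
```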
Form
A screen with open or editable fields, inputs, and other controls.
User: Display audit for 2L of Pepsi, Mountain Dew, 7Up.
Alan: Review. Any comments?
User: Sale was great...products got sold out and replace by 15th...December 15th
A user might ask to fill, submit, or change data in fields, or set up any number of controls, with a single simple phrase.
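Filling several fields from one phrase boils down to slot extraction. Here is a toy sketch where a regex stands in for whatever NLU the voice platform actually provides; the `AuditForm` shape and function name are invented:

```typescript
// Hypothetical sketch of the Form pattern: one spoken phrase fills several
// form fields at once.

interface AuditForm {
  volume?: string;
  products?: string[];
}

// Parse "Display audit for 2L of Pepsi, Mountain Dew, 7Up" into fields.
function parseAuditRequest(utterance: string): AuditForm {
  const match = utterance.match(/audit for (\S+) of (.+)/i);
  if (!match) return {};
  return {
    volume: match[1],
    products: match[2].split(",").map(p => p.trim()),
  };
}
```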
Navigation
Showing a specific screen, tab, or section.
User: Submit and go to homescreen
Alan: Submitted and navigated to the homescreen
Users have certain habits and want less cognitive stress when looking at a particular screen, tab, or folder. They also need a sense of how they got there, so we can surface navigation when the user decides to go back or forward.
Also, when we think of navigation, we mean searching for and picking items from any similar or different sets.
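A voice navigation handler mostly maps a spoken destination to a route and confirms the result. A minimal sketch, assuming a hypothetical route table (the names here are made up for illustration):

```typescript
// Hypothetical sketch of the Navigation pattern: a phrase may carry both an
// action ("submit") and a destination ("homescreen").

const routes: Record<string, string> = {
  homescreen: "/",
  orders: "/orders",
  settings: "/settings",
};

// Resolve a spoken destination to a route, confirming what happened.
function navigate(utterance: string): { route: string; reply: string } {
  const dest = Object.keys(routes).find(r =>
    utterance.toLowerCase().includes(r)
  );
  if (!dest) return { route: "", reply: "Which screen would you like to open?" };
  return { route: routes[dest], reply: `Navigated to the ${dest}` };
}
```

Keeping the confirmation in the reply ("Navigated to the homescreen") gives the user the sense of place the paragraph above describes.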
Notifications
All notifications to the user from the system or other users, in any format, frequency, and type (errors, updates, emails, emergencies, etc.).
[popup on the screen]
Alan: Your work order has been changed due to a storm. We received a weather update. Would you like to know more about the new work order?
Map
All actions with screens where users may zoom in and out; layers may contain lists and content items depending on zoom level (or not).
User: What status do the closest units have?
Alan: The first unit has "in progress", the second is free now, the third is offline.
Quiz or questionnaire
A list or a sequence of questions, consisting of two parts:
- a list of questions, shown upfront or hidden
- a screen or block with Q & A
Alan: How would you describe your experience in our app?
User: It's good
The user selects a question to answer or Alan asks it. Users may navigate questions and give answers according to the setup.
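The walk-through-the-questions flow can be sketched as a cursor over a fixed question list. This is illustrative only; the `Quiz` shape and function names are invented:

```typescript
// Hypothetical sketch of the Quiz pattern: ask the next unanswered question,
// store each spoken answer, and close when the list is exhausted.

interface Quiz {
  questions: string[];
  answers: string[];
}

// Ask the next unanswered question, or close the quiz when done.
function nextPrompt(quiz: Quiz): string {
  return quiz.answers.length < quiz.questions.length
    ? quiz.questions[quiz.answers.length]
    : "That was the last question. Thanks for your feedback!";
}

// Record the user's spoken answer to the current question.
function answer(quiz: Quiz, text: string): void {
  if (quiz.answers.length < quiz.questions.length) quiz.answers.push(text);
}
```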
List, Form, Content, Navigation, and Task in a single consumer use case
In this example of a consumer app (pizza delivery), several patterns are involved.
We are at the cusp of something very exciting, with more ground to cover in Voice UX for existing Application UI. To continue this journey, the first step is to get familiar with these patterns and see how they can apply to your own applications or the applications you use every day. All of these patterns are available today to all users in Alan Studio to implement in applications. Get started here.
Originally published at https://alan.app on February 10, 2020.