
CONVA Search

Adding an accurate, multilingual Voice Search capability to your app
By now, you should have configured and published your Assistant via the Slang Console, and perhaps customized it as required. Congratulations! :) If you have not already done that, you can do so by following the instructions here.
Note that you need to ensure that the "Voice Search" user journey is enabled for your Assistant.
Let's start coding!
For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.

1. Configure the build system

The first step is to update the app's build system to include Slang's Retail Assistant SDK.
Android Native
iOS
React Native
Flutter
Web

Add the Slang dependency to your Gradle files

Add the path to the Slang Maven repository to your top-level Gradle file
// Add this to your top-level Gradle file
allprojects {
    repositories {
        maven { url "https://gitlab.com/api/v4/projects/25706948/packages/maven" }
    }
}
Add the Slang Retail Assistant dependency to your app's Gradle file
// Add this to your app's Gradle file
dependencies {
    implementation 'in.slanglabs.assistants:slang-retail-assistant:8.2.7'
}

Setting up via CocoaPods

Add the path to the Slang Cocoapod repository to your Podfile
# Add this to your podfile
source 'https://github.com/SlangLabs/cocoapod-specs'
source 'https://github.com/CocoaPods/Specs.git'
Add the SlangRetailAssistant dependency to your Podfile
pod 'SlangRetailAssistant'
Add support for granting microphone permission. On iOS, the user must explicitly grant permission for an app to access their data and resources. An app with the SlangRetailAssistant requires access to the device microphone for voice interactions.
To comply with this requirement, you must add the NSMicrophoneUsageDescription key to the Info.plist file of your app and provide a message explaining why your app requires access to the microphone. The message will be displayed only when the SlangRetailAssistant needs to activate the microphone.
To add the key:
  1. In the Xcode project, go to the Info tab.
  2. In the Custom iOS Target Properties section, hover over any key in the list and click the plus icon to its right.
  3. From the list, select Privacy - Microphone Usage Description.
  4. In the Value field to the right, provide a description for the added key. This description will be displayed to the user when the app requests microphone access. For example: "We require microphone permission to enable the voice assistant platform"

Install the Slang Retail Assistant package

The next step is to install the required packages inside your code repo

yarn setup

If you use yarn for installing packages, run the below command
yarn add @slanglabs/slang-conva-react-native-retail-assistant

npm setup

If you use npm for managing your packages, run the below command
npm install @slanglabs/slang-conva-react-native-retail-assistant --save
Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step
react-native link @slanglabs/slang-conva-react-native-retail-assistant
Try out the developer playground app to understand the Assistant.

Install the Slang Retail Assistant package

Run the below command to install the required packages inside your code repo.
$ flutter pub add slang_retail_assistant
Once done, run 'dart pub get' and ensure the Slang Retail Assistant is added to your pubspec.yaml dependencies.
dependencies:
  slang_retail_assistant: ^8.0.0

Next, import the Slang Retail Assistant in your Dart code.

import 'package:slang_retail_assistant/slang_retail_assistant.dart';

Install the Slang Retail Assistant package

The next step is to install the required packages inside your code repo

yarn setup

If you use yarn for installing packages, run the below command
$ yarn add @slanglabs/[email protected]

npm setup

If you use npm for managing your packages, run the below command
$ npm install @slanglabs/[email protected] --save

2. Code integration

2.1 Initialization

The next step is to initialize the SDK with the keys you obtained after creating the Assistant in the Slang console.
Android Native
iOS
React Native
Flutter
Web
The recommendation is to perform the initialization in the onCreate method of the main Activity where the search bar will be visible.
// Your main activity class
protected void onCreate(Bundle savedInstance) {
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey(<API Key>)
        .setAssistantId(<AssistantId>)
        .build();
    SlangRetailAssistant.initialize(this, configuration);
}
The recommendation is to perform the initialization in the viewDidLoad method of the main ViewController where the search bar will be visible.
import SlangRetailAssistant

override func viewDidLoad() {
    super.viewDidLoad()
    ...
    let config = AssistantConfiguration.Builder()
        .setAPIKey("<APIKey>")
        .setAssistantId("<AssistantId>")
        .build()
    SlangRetailAssistant.initialize(with: config)
}
This should ideally be done in the componentDidMount of your main app component.
import SlangConvaTrigger, {SlangRetailAssistant} from '@slanglabs/slang-conva-react-native-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantId: '<assistant id>', // The Assistant ID from the console
    apiKey: '<API Key>', // The API key from the console
})
This should ideally be done inside the main method.
import 'package:slang_retail_assistant/slang_retail_assistant.dart';

var assistantConfig = new AssistantConfiguration()
    ..assistantId = "<AssistantId>"
    ..apiKey = "<APIKey>"
    ..requestedLocales = ['en-IN', 'hi-IN'];
SlangRetailAssistant.initialize(assistantConfig);
import SlangRetailAssistant from '@slanglabs/slang-retail-assistant';

await SlangRetailAssistant.init({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantID: '<assistant id>', // The Assistant ID from the console
    apiKey: '<API Key>', // The API key from the console
})

2.2 Show Slang CONVA Trigger

Trigger refers to the UI element that will appear on the screen, which the end user will click to bring up the Assistant. Slang CONVA comes with two trigger styles.
  • Inline Trigger - Insert the trigger in a custom location in your application, usually next to your search bar
  • Global Trigger - Insert the trigger as a FAB-like button, which will automatically appear on all screens of your application.
Note that, for now, you can enable only one of the above trigger types for an application.

Inline Trigger

This is the most common way to add Voice Search to apps, i.e., adding the trigger (which by default is a microphone icon) either inside or next to the search bar.
Inline Trigger
Android Native
iOS
React Native
Flutter
Web
Add the below XML element to your UI definition in the place where you want the trigger (the default image is a microphone icon) to appear (usually next to the search bar).
<in.slanglabs.assistants.retail.SlangConvaTrigger
    android:layout_width="45dp"
    android:layout_height="45dp"
/>
Currently, there is no ready-made iOS UI element for adding the inline trigger. This will be added shortly. Meanwhile, developers can add a mic button of their own and perform the following:
  1. Enable the custom trigger option on the AssistantConfiguration:
// Add the following config option to the existing AssistantConfiguration
let configuration = AssistantConfiguration.Builder()
    .setAPIKey("<API Key>")
    .setAssistantId("<AssistantId>")
    .enableCustomTrigger(true) // Enable this particular option
    .build()
  2. Call the startConversation API when the mic button is clicked:
SlangRetailAssistant.startConversation(.search, self)
Add the below element to your UI definition in the place where you want the trigger (the default image is a microphone icon) to appear (usually next to the search bar).
const styles = StyleSheet.create({
    triggerStyle: {
        width: 45,
        height: 45,
    },
});

<SlangConvaTrigger
    style={styles.triggerStyle}
/>
To specify a custom image for the microphone icon, please specify the src field in the trigger element.
src: A string representing the remote URL of the image.
Example :
<SlangConvaTrigger
    src={{uri: Image.resolveAssetSource(require('../Images/trigger.png')).uri}}
    style={styles.triggerStyle}
/>
Currently, there is no ready-made Flutter UI element for adding the inline trigger. This will be added shortly. Meanwhile, developers can add a mic button of their own and use the API mentioned here to trigger CONVA programmatically.
Currently, there is no ready-made Web UI element for adding the inline trigger. This will be added shortly. Meanwhile, developers can add a mic button of their own and perform the following:
  1. Enable the custom trigger via the UI API:
// Enable the custom trigger using the following API
SlangRetailAssistant.getUI().enableCustomTrigger(true);
  2. Call the startConversation API when the mic button is clicked:
// Default UserJourney is available through SlangRetailAssistant.AssistantUserJourneys.<UserJourneyName>
// Default Subdomain is available through SlangRetailAssistant.AssistantSubdomains.<SubdomainName>
SlangRetailAssistant.startConversation(<Default UserJourney, Default Subdomain>)
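The two steps above can be sketched together in plain JavaScript. This is a minimal illustration, not SDK code: `micButton` is your own DOM element, `assistant` stands in for the SlangRetailAssistant object, and the `SEARCH` constant name is an assumption based on the `AssistantUserJourneys.<UserJourneyName>` pattern shown above.

```javascript
// Sketch only: wiring a custom mic button to the Assistant.
// `assistant` stands in for SlangRetailAssistant; the
// AssistantUserJourneys.SEARCH constant name is an assumption.
function attachMicButton(micButton, assistant) {
  const onClick = () => {
    // Start a voice conversation for the default search journey
    assistant.startConversation(assistant.AssistantUserJourneys.SEARCH);
  };
  micButton.addEventListener('click', onClick);
  // Return a detach helper so the listener can be removed on teardown
  return () => micButton.removeEventListener('click', onClick);
}
```

Returning a detach helper makes it easy to unhook the mic button when the screen that owns it is torn down.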

Global Trigger

The trigger can also be enabled globally, i.e., the trigger is automatically visible across all screens as a floating FAB-like button. The button can be moved around and will show up automatically on all screens.
Global Trigger
To make the trigger visible, call the "show" method in the appropriate place for your platform:
Android Native
iOS
React Native
Flutter
Web
Add the below line to the onResume method of the Activities where you want the Assistant to be enabled.
@Override
protected void onResume() {
    ...
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too if needed
}
Add the below line to the viewDidAppear method of the ViewControllers where you want the Assistant to be enabled.
override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)
    ...
    SlangRetailAssistant.getUI().showTrigger(in: self)
}
SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed
SlangRetailAssistant.getUI().showTrigger();
SlangRetailAssistant.getUI().showTrigger(); // There is a corresponding hide too if needed
By default, the trigger is sticky, meaning it will appear on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you can call:
Android Native
iOS
React Native
Flutter
Web
SlangRetailAssistant.getUI().hideTrigger(this)
SlangRetailAssistant.getUI().hideTrigger(in: self)
SlangRetailAssistant.ui.hideTrigger();
SlangRetailAssistant.getUI().hideTrigger();
SlangRetailAssistant.getUI().hideTrigger();
Note that, by default, CONVA will show a coach mark on the trigger when it appears for the first time on the screen. Refer to the API below for details on how to control it.

2.3 Handling voice search requests

Once the app has integrated the trigger, the next step is to register a callback to handle the voice search commands from the end user. CONVA will process the utterance spoken by the end user, and if it detects that the user is trying to do a valid search operation, it will invoke the registered callback with the search string. Note that the search string will always be in English, irrespective of the language the end user spoke in; CONVA automatically translates inputs in other languages into English.
Android Native
iOS
React Native
Flutter
Web
Android (Deprecated)
SlangRetailAssistant.setOnSearchListener(
    new OnSearchListener() {
        @Override
        public void onSearch(SearchInfo searchInfo, final SearchUserJourney searchUserJourney) {
            String searchString = searchInfo.getSearchString();
            // Fire the actual search using the searchString
        }
    }
);
SlangRetailAssistant.onSearch = { (searchInfo, searchUserJourney) in
    let searchString = searchInfo.searchString
    // Fire the actual search using the searchString
    ...
}
SlangRetailAssistant.setAction({
    onSearch: (searchInfo, searchUserJourney) => {
        const searchString = searchInfo["description"];
        // Fire the actual search using the searchString
    }
});
// Set the action handler via the setAction method
class AppAction implements RetailAssistantAction {
  @override
  void onAssistantError(Map<String, String> assistantError) {
    // Handle errors that might have occurred during the Assistant lifecycle
  }

  @override
  SearchAppState onSearch(SearchInfo searchInfo, SearchUserJourney searchUserJourney) {
    // Handle search requests
    // ...
    return new SearchResultAppState(SUCCESS);
  }
}

var action = new AppAction();
SlangRetailAssistant.setAction(action);
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        searchUserJourney.setSuccess();
        return SearchUserJourney.AppStates.SEARCH_RESULTS;
    },
    onManageOrder: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        orderManagementUserJourney.setViewSuccess();
        return OrderManagementUserJourney.AppStates.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        navigationUserJourney.setNavigationSuccess();
        return NavigationUserJourney.AppStates.NAVIGATION;
    },
};
SlangRetailAssistant.setAction(actionHandler)
SlangRetailAssistant.setAction(new SlangRetailAssistant.Action() {
    @Override
    public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle search requests
        // ...
        searchJourney.setSuccess();
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    }

    @Override
    public NavigationUserJourney.AppState onNavigation(NavigationInfo navInfo, NavigationUserJourney navJourney) {
        // Handle navigation requests
        // ...
        navJourney.setNavigationSuccess();
        return NavigationUserJourney.AppState.NAVIGATION;
    }

    @Override
    public OrderManagementUserJourney.AppState onOrderManagement(OrderInfo orderInfo, OrderManagementUserJourney orderJourney) {
        // Handle order management requests
        // ...
        orderJourney.setViewSuccess();
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred during the Assistant lifecycle
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});

2.4 Specifying Custom Events for Deeper Analytics

As part of its Analytics offering, CONVA provides a few metrics that can be used to track the effectiveness of the Voice Search experience.
  • VTS - Voice-to-Text - tracks users' affinity for voice versus text searches in any search session
  • CTR - Clickthrough Rate - allows the app to compare the click-throughs that happened on the search results page when the user landed on it via text or voice search
To compute these metrics, the app is expected to share a couple of events at the appropriate times. Let's look into each of these required events.

2.4.1 Text Search Events

CONVA provides an API with which the app can indicate that a search happened via UI/text-based interactions. This event is used to determine the adoption and engagement of voice compared to traditional UI-based interactions.
Android Native
iOS
React Native
Web
SlangRetailAssistant.notifyTextSearch("<SearchString>");
SlangRetailAssistant.notifyTextSearch("<SearchString>")
SlangRetailAssistant.notifyTextSearch("<SearchString>");
// Under Development
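As a sketch of where this call typically lives, the snippet below (plain JavaScript; `makeSearchHandler` and `performSearch` are hypothetical names from your own app, and `assistant` stands in for SlangRetailAssistant) notifies CONVA just before firing the app's regular text search:

```javascript
// Sketch only: makeSearchHandler and performSearch are hypothetical
// app-side names; `assistant` stands in for SlangRetailAssistant.
function makeSearchHandler(assistant, performSearch) {
  return function onSearchSubmit(query) {
    const trimmed = query.trim();
    if (!trimmed) return; // ignore empty submissions
    // Tell CONVA this search came from typing, not voice,
    // so the VTS metric can compare the two channels
    assistant.notifyTextSearch(trimmed);
    performSearch(trimmed);
  };
}
```

The returned handler can be attached to your search bar's submit event so every text search is reported exactly once.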

2.4.2 CTR Events

Use the below API to inform CONVA when a user clicks on the search results. Note that the user could have clicked on different sections of the search result - wishlist, product, add-to-cart. Providing the exact section that was clicked allows CONVA's Analytics to provide more fine-grained comparisons between voice and text searches.
Android Native
iOS
React Native
SlangRetailAssistant.notifyCTREvent(
    new HashMap<String, String>() {{
        // The section that was clicked. Here are the
        // supported strings:
        // "NavigatedToProductPage"
        // "AddedToCart"
        // "AddedToWishlist"
        // "ShadeSelected"
        put("eventName", "<SectionName>");
        // The product item that was clicked
        // Eg: "Organic Onions"
        put("itemName", "<ItemName>");
    }});
SlangRetailAssistant.notifyCTREvent(
    eventMetadata: ["eventName": "<SectionName>", "itemName": "<ItemName>"]
)
// SectionName values include:
// "NavigatedToProductPage"
// "AddedToCart"
// "AddedToWishlist"
// "ShadeSelected"
// ItemName specifies the item that was clicked
// Eg: "Organic Onions"
var eventInfo = {
    eventName: "<SectionName>", // The section that was clicked. Here are the
                                // supported strings:
                                // "NavigatedToProductPage"
                                // "AddedToCart"
                                // "AddedToWishlist"
                                // "ShadeSelected"
    itemName: "<ItemName>", // The product item that was clicked
                            // Eg: "Organic Onions"
}
SlangRetailAssistant.notifyCTREvent(eventInfo)
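Since the set of supported section names is fixed, it can help to validate the event before sending it. The helper below is a hypothetical app-side sketch (not part of the SDK) that rejects unsupported section names early:

```javascript
// Sketch only: an app-side guard around notifyCTREvent.
// The supported section names are taken from the list above.
const SUPPORTED_CTR_SECTIONS = [
  'NavigatedToProductPage',
  'AddedToCart',
  'AddedToWishlist',
  'ShadeSelected',
];

function buildCTREvent(eventName, itemName) {
  if (!SUPPORTED_CTR_SECTIONS.includes(eventName)) {
    throw new Error('Unsupported CTR section: ' + eventName);
  }
  return { eventName: eventName, itemName: itemName };
}
```

The result can then be passed straight to SlangRetailAssistant.notifyCTREvent(...), catching typos in section names during development rather than in production analytics.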

2.4.3 Associating App specific UserIds to the CONVA Analytics Events

By default, CONVA-generated analytics events do not capture any PII about the user. CONVA uses a generated DeviceId to uniquely identify the device on which the app instance is running. But if an app wants CONVA to associate these events with a user id it already knows for the current user (say, a logged-in user), it can use the below API to trigger that association.
Android Native
iOS
React Native
Web
SlangRetailAssistant.setUserId("<UserId>");
SlangRetailAssistant.setUserId("<UserId>")
SlangRetailAssistant.setUserId("<UserId>");
// Under Development
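A natural place to trigger this association is right after a successful login. The snippet below is a hypothetical app-side sketch (`associateConvaUser` and the `user` object shape are your own, and `assistant` stands in for SlangRetailAssistant), not SDK code:

```javascript
// Sketch only: associate the app's user id with CONVA analytics
// after login; anonymous sessions keep only the CONVA DeviceId.
function associateConvaUser(assistant, user) {
  if (!user || !user.id) return false; // nothing to associate
  assistant.setUserId(String(user.id));
  return true;
}
```

Calling this from your auth flow's success callback ensures the association happens once per session, and anonymous users remain identified only by the CONVA-generated DeviceId.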

2.5 Assistant Prompts

Based on how the Assistant was configured in the Console and the conditions that were set, the Assistant will speak an appropriate message to the user.
The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.
That's it! These are the basic set of steps required to add Slang's In-App Voice Assistant to your app.
Beyond this integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of the apps. Please refer to the Advanced Concepts section for more details.

2.6 Nudging Users

We also offer a nudge API to trigger the coach mark on the CONVA trigger on demand.
NOTE: This works only with the Global and Inline Triggers managed by the Slang Assistant SDK. It will not work with custom app triggers/mic buttons.

2.6.1 Nudging with messages controlled remotely

The CONVA console allows specifying the message to be shown in the nudges. To dynamically show the coach mark with the messages configured in the console, use the below APIs:
Android Native
iOS
React Native
Flutter
Web
SlangRetailAssistant.getUI().nudgeUser();
SlangRetailAssistant.getUI().nudgeUser()
SlangRetailAssistant.ui.nudgeUser();
// TBD
// Under Development

2.6.2 Nudging with custom runtime messages

Sometimes it is useful to show more contextual messages to educate users about the voice capability in the app. E.g., after the user does a text search, the app can show a coach mark pointing to the mic trigger, informing the user to try voice next time.
To perform such contextual nudges, use the below API.
The app can pass language-specific prompts, and CONVA will pick the right language based on the currently selected locale.
There are two strings that can be specified:
  • The title string - the one that shows up in bold in the first line
  • The description string - the one that shows up in regular style in the second line
Android Native
iOS
React Native
Flutter
Web
SlangRetailAssistant.getUI().nudgeUser(sActivity,
    new HashMap<Locale, String>() {{
        put(SlangLocale.LOCALE_ENGLISH_IN, "Title");
        put(SlangLocale.LOCALE_HINDI_IN, "शीर्षक");
    }},
    new HashMap<Locale, String>() {{
        put(SlangLocale.LOCALE_ENGLISH_IN, "Description");
        put(SlangLocale.LOCALE_HINDI_IN, "विवरण");
    }}
);
SlangRetailAssistant.getUI().nudgeUser(in: self,
    title: [
        Locale(identifier: "en-IN"): "Title",
        Locale(identifier: "hi-IN"): "शीर्षक"
    ],
    description: [
        Locale(identifier: "en-IN"): "Description",
        Locale(identifier: "hi-IN"): "विवरण"
    ]
)
var title = {
    "en-IN": "title",
    "hi-IN": "शीर्षक"
}
var description = {
    "en-IN": "description",
    "hi-IN": "विवरण"
}
SlangRetailAssistant.ui.nudgeUserWithParameters(title, description);
// TBD
// Under Development