Code Integration - Basic Steps

Integrating the Slang Voice Assistant with your app

By now you must have configured and published your Assistant via the Slang Console. Congratulations! :) If you have not already done that, you can do so by following the instructions here.

While the overall idea is similar across platforms, the specific steps involved vary slightly based on the platform on which your app is built. Supported platforms are:

  • Android Native

  • React Native for Android

  • Web (JS)

Let's start coding!

For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.

1. Configure the build system

The first step is to update the app's build system to include Slang's Retail Assistant SDK.

Java

Add the Slang dependency to your gradle files

Add the path to the Slang maven repository to your top-level gradle file

// Add this to your top-level gradle file
allprojects {
    repositories {
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}

Add the Slang Retail Assistant dependency to your app's gradle file

// Add this to your app's gradle file
dependencies {
    implementation 'in.slanglabs.assistants:slang-retail-assistant:4.0.1'
}
React Native

Install the Slang Retail Assistant package

The next step is to install the required packages inside your code repo

yarn setup

If you use yarn for installing packages, run the command below

$ yarn add @slanglabs/slang-conva-react-native-retail-assistant

npm setup

If you use npm for managing your packages, run the command below

$ npm install @slanglabs/slang-conva-react-native-retail-assistant --save

Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step

$ react-native link @slanglabs/slang-conva-react-native-retail-assistant

Finally, add the path to the Slang maven repository (to download native library dependencies) to your top-level gradle file

// Add this to your top-level gradle file
allprojects {
    repositories {
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}
Web

Install the Slang Retail Assistant package

The next step is to install the required packages inside your code repo

yarn setup

If you use yarn for installing packages, run the command below

$ yarn add @slanglabs/slang-retail-assistant

npm setup

If you use npm for managing your packages, run the command below

$ npm install @slanglabs/slang-retail-assistant --save

2. Code integration

2.1 Initialization

The next step is to initialize the SDK with the keys you obtained after creating the Assistant in the Slang console.

Java

We recommend performing the initialization in the onCreate method of your Application class. If the app does not use an Application class, the next best place is the onCreate method of the primary Activity class.

// Your Application class
@Override
public void onCreate() {
    super.onCreate();
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey(<API Key>)
        .setAssistantId(<AssistantId>)
        .setEnvironment(STAGING) // Change this to PRODUCTION once you've published the Assistant to the production environment
        .build();
    SlangRetailAssistant.initialize(this, configuration);
}
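
If your app does not have an Application class, the same initialization can instead go in the onCreate of your primary Activity. Below is a minimal sketch reusing the configuration shown above; depending on the SDK version, initialize may expect an Application instance, in which case pass getApplication() instead of this.

// Your primary Activity class
@Override
protected void onCreate(Bundle savedInstance) {
    super.onCreate(savedInstance);
    ...
    // Build the same AssistantConfiguration as shown above
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey(<API Key>)
        .setAssistantId(<AssistantId>)
        .setEnvironment(STAGING)
        .build();
    // If initialize expects an Application, pass getApplication() instead of this
    SlangRetailAssistant.initialize(this, configuration);
}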
React Native

This should ideally be done in the componentDidMount of your main app component

import SlangRetailAssistant from '@slanglabs/slang-conva-react-native-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantId: '<assistant id>', // The Assistant ID from the console
    apiKey: '<API Key>', // The API key from the console
});
Web

import SlangRetailAssistant from '@slanglabs/slang-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantID: '<assistant id>', // The Assistant ID from the console
    apiKey: '<API Key>', // The API key from the console
});

2.2 Show the Trigger (microphone icon)

Once the Assistant is initialized, the next step is to show the microphone UI element (what we call the Trigger) that the app's users can click on to invoke the Assistant and speak to it.

Android Native

Add the line below to the onResume method of the Activities where you want the Assistant to be enabled.

@Override
protected void onResume() {
    super.onResume();
    ...
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too if needed
}
React Native

Call the showTrigger and hideTrigger methods as required to control the visibility of the Trigger

SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed
Web

Call the show and hide methods as required to control the visibility of the Trigger

SlangRetailAssistant.ui.show(); // There is a corresponding hide too if needed

The trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call: SlangRetailAssistant.getUI().hideTrigger(this)
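
For example, a minimal Android sketch that keeps the Trigger off a particular screen while the rest of the app continues to show it:

// Inside the Activity that should not show the Trigger
@Override
protected void onResume() {
    super.onResume();
    // The Trigger is sticky, so hide it explicitly on this screen
    SlangRetailAssistant.getUI().hideTrigger(this);
}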

2.3 Implement Actions

Refresher: A UserJourney represents a path that a user may take to reach their goal when using a web or mobile app. See Voice Assistant Concepts for details.

Last but not least, the app needs to implement the Actions associated with the various User Journeys supported by the Assistant. This can be done as shown below:

Android Native
SlangRetailAssistant.setAction(new SlangRetailAssistant.Action() {
    @Override
    public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle search requests
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    }

    @Override
    public NavigationUserJourney.AppState onNavigation(NavigationInfo navInfo, NavigationUserJourney navJourney) {
        // Handle navigation requests
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    }

    @Override
    public OrderManagementUserJourney.AppState onOrderManagement(OrderInfo orderInfo, OrderManagementUserJourney orderJourney) {
        // Handle order management requests
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred during the Assistant lifecycle
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});
React Native
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    },
    onOrderManagement: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    },
    onAssistantError: errorCode => {
        // Handle errors that might have happened during the processing of the Assistant
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    },
};

SlangRetailAssistant.setAction(actionHandler);
Web
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        return SearchUserJourney.AppStates.SEARCH_RESULTS;
    },
    onManageOrder: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        return OrderManagementUserJourney.AppStates.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppStates.NAVIGATION;
    },
};

SlangRetailAssistant.setAction(actionHandler);

The following user journeys are currently supported by the Slang Retail Assistant:

  • Voice Search

  • Voice Navigation

  • Voice Order Management

The Action Handler interface has an explicit callback for each of the supported user journeys. Whenever the Assistant detects the user journey the user is interested in (based on what they spoke), it invokes the callback associated with that user journey.

When these callbacks are invoked, the Assistant also passes along the parametric data it was able to gather for that user journey. The app is then expected to do the following (a combined sketch appears after this list):

  1. Consume the parametric data as needed

  2. Optionally launch appropriate UI actions

  3. Set appropriate conditions in the Assistant based on the app's internal state

  4. Return the AppState that the app transitioned to
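
Putting those four steps together for the search journey on Android, a minimal sketch could look like the one below. The accessor used to read the search data and the showSearchResults helper are illustrative assumptions, not part of the Slang SDK; check the SearchInfo class for the fields it actually exposes.

@Override
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // 1. Consume the parametric data (getSearchString() is an assumed accessor;
    //    refer to the SDK's SearchInfo class for the actual fields)
    String query = searchInfo.getSearchString();

    // 2. Optionally launch the appropriate UI action (app-specific helper, not a Slang API)
    showSearchResults(query);

    // 3. Set the appropriate condition based on the app's internal state
    searchJourney.setSearchSuccess();

    // 4. Return the AppState the app transitioned to
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}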

2.4 Return the AppState

Refresher: An AppState typically corresponds to a screen that the app transitioned to based on user input. See Voice Assistant Concepts for details.

An AppState indicates which state the app transitioned to, based on the user journey and the parametric data that was passed to the app. The list of supported AppStates depends on the user journey.

Android Native
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the search requests
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
React Native
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
Web
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppStates.SEARCH_RESULTS;
}

2.4.1 Set AppState condition

AppState conditions represent more detailed states of the app within a particular AppState. For example, when performing a search, the search might have failed or the items might be out of stock. The app can use AppState conditions to report the exact condition to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.

Android Native
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the search requests
    // ...
    searchJourney.setSearchSuccess(); // Set the condition
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
React Native
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    searchUserJourney.setSearchSuccess();
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
}
Web
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    searchUserJourney.setSuccess();
    return SearchUserJourney.AppStates.SEARCH_RESULTS;
}
}

2.4.2 Assistant Prompts

Based on the AppState returned and the conditions that were set, the Assistant will speak out an appropriate message to the user.
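
For example, if the search failed, the app would set the corresponding failure condition instead of the success one before returning, and the Assistant would then speak the failure prompt. The setter name below is an assumption; check the SearchUserJourney class for the exact condition setters it provides.

@Override
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // ...
    // Assumed failure setter; the actual method name may differ in the SDK
    searchJourney.setSearchFailure();
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}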

The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section, if you're interested in customization.

That's it! These are the basic steps required to add Slang's In-App Voice Assistant to your app.

Beyond this basic integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the more advanced needs of apps. Refer to the Advanced Concepts section if you're looking for more advanced features.