Code Integration - Basic steps

This section covers what you need to do in code to integrate Slang's Retail In-App Voice Assistant into your application.

By now, you should have configured and published your Assistant via the Slang Console. Congratulations :) If you have not done that already, you can do so by following the instructions here.

The next step is the more fun part. Writing code. But don't worry. You don't need to do a lot of it. Just a few lines and you are done.

While the mechanism is similar, the exact steps depend on the type of app you have.

  • Android Native app

  • React Native app

  • Web app

Let's start coding!

Note that we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.

Build Setup

The first step is to set up the build system to include Slang's Retail Assistant SDK.

Java

Add the Slang dependency to your gradle files

Add the path to the Slang maven repository to your top-level gradle file

// Add this to your top-level gradle file
allprojects {
    repositories {
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}

Add the Slang Retail Assistant dependency to your app's gradle file

// Add this to your app's gradle file
dependencies {
    implementation 'in.slanglabs.assistants:slang-retail-assistant:0.0.51.3'
}
React Native

Install the Slang Retail Assistant package

The next step is to install the required packages inside your code repo

yarn setup

If you use yarn for installing packages, run the below command

$ yarn add @slanglabs/react-native-slang-retail-assistant

npm setup

If you use npm for managing your packages, run the below command

$ npm install @slanglabs/react-native-slang-retail-assistant --save

Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step

$ react-native link @slanglabs/react-native-slang-retail-assistant

Finally, add the path to the Slang maven repository (to download native library dependencies) to your top-level gradle file

// Add this to your top-level gradle file
allprojects {
    repositories {
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}
Web

TBD

Basic Code Integration

Initialization

The next step is to initialize the SDK with the keys you obtained when creating the Assistant in the Slang console.

Java

The ideal place to do this is in the onCreate method of the Application class.

// Your Application class
@Override
public void onCreate() {
    super.onCreate();
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey("<API Key>")
        .setAssistantId("<AssistantId>")
        .build();
    SlangRetailAssistant.initialize(this, configuration);
}
React Native

This should ideally be done in the componentDidMount of your main app component

import SlangRetailAssistant from '@slanglabs/react-native-slang-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantId: '<assistant id>',        // The Assistant ID from the console
    apiKey: '<API Key>',                  // The API key from the console
});
Web

TBD

Show the Trigger (microphone icon)

Once the Assistant is initialized, the next step is to show the UI element (the microphone icon, or what we call the Trigger) that the app's users can click to invoke the Assistant and speak to it.

Android Native

Add the below line to the onResume method of the Activities where you want the Assistant to be enabled.

@Override
protected void onResume() {
    super.onResume();
    ...
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too if needed
}
React Native

Call the "showTrigger" and "hideTrigger" methods as required to control the visibility of the Assistant's trigger

SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed
Web

The trigger is sticky; that is, it will show up on all Activities after it is made visible. One needs to call "hideTrigger" explicitly to hide it.
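For example, if there is a screen where the Assistant should not appear, the trigger can be hidden when that Activity comes to the foreground. The sketch below assumes hideTrigger accepts the Activity, mirroring the showTrigger call shown above.

// A minimal sketch: hide the trigger on a screen where the Assistant should not appear.
@Override
protected void onResume() {
    super.onResume();
    // Assumption: hideTrigger(Activity) mirrors showTrigger(Activity).
    // The trigger stays hidden until showTrigger is called again.
    SlangRetailAssistant.getUI().hideTrigger(this);
}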

Implement Actions

Last but not least, the app needs to implement the Actions associated with the various User Journeys supported by the Assistant. This can be done as shown below

Android Native
SlangRetailAssistant.setAction(new SlangRetailAssistant.Action() {
    @Override
    public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle the search request
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    }

    @Override
    public NavigationUserJourney.AppState onNavigation(NavigationInfo navInfo, NavigationUserJourney navJourney) {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    }

    @Override
    public OrderManagementUserJourney.AppState onOrderManagement(OrderInfo orderInfo, OrderManagementUserJourney orderJourney) {
        // Handle order management requests
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred while the Assistant was processing the request
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});
React Native
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    },
    onOrderManagement: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    },
    onAssistantError: errorCode => {
        // Handle errors that might have occurred while the Assistant was processing the request
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    },
};

SlangRetailAssistant.setAction(actionHandler);
Web

TBD

The following user journeys are supported currently

  • Voice Search

  • Voice Navigation

  • Voice augmented Order Management

The Action Handler interface has an explicit callback for each of the supported user journeys.

Refer to the section about User Journeys and App States to better understand those concepts

Whenever the Assistant detects the user journey the end-user is interested in (based on what they said), it will invoke the callback associated with that user journey.

When these callbacks are invoked, the Assistant also passes the "info" object containing any additional data the Assistant was able to gather. The app is then expected to (as illustrated in the sketch after this list):

  • Consume the "info" parameter as needed

  • Optionally launch appropriate UI actions

  • Set appropriate conditions in the Assistant using the "journey" parameter

  • Return the AppState that the app transitioned to
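As an illustration, a sketch of an onSearch implementation that ties these steps together is shown below. The accessor on searchInfo and the search screen mentioned in the comments are hypothetical; use your app's own data accessors and UI flow.

@Override
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // 1. Consume the "info" parameter
    //    (the exact accessors exposed by SearchInfo are not shown in this section;
    //     toString() is only a stand-in for the real accessor)
    String query = searchInfo.toString();

    // 2. Optionally launch the appropriate UI, e.g. start your app's search screen
    //    startActivity(new Intent(context, SearchActivity.class)); // hypothetical Activity

    // 3. Set the appropriate condition on the "journey" parameter
    searchJourney.setSearchSuccess();

    // 4. Return the App State the app transitioned to
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}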

Returning App State

The App State indicates which state the app is going into based on the "info" that was passed to it. The list of supported App States depends on the user journey.

Android Native
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the search request
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
React Native
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
Web
TBD

Setting App State Conditions

App State Conditions represent the sub-state or condition of the app inside that particular App State. For example, when performing the search, it might have failed, or the items might be out of stock or unavailable. The app can use the App State Condition to indicate the correct condition to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.

Android Native
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the search request
    // ...
    searchJourney.setSearchSuccess(); // Set the condition
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
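If the search did not succeed, the app should set a failure condition instead before returning. The sketch below is only illustrative: runSearch is a hypothetical app-side helper, and setSearchFailure is an assumed counterpart of setSearchSuccess; check the SDK for the exact condition setters available.

public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    boolean found = runSearch(searchInfo); // hypothetical app-side search helper
    if (found) {
        searchJourney.setSearchSuccess();
    } else {
        searchJourney.setSearchFailure(); // assumed failure-condition setter; verify against your SDK version
    }
    // The right App State to return may differ for the failure path
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}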
React Native
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    searchUserJourney.setSearchSuccess();
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
Web

Assistant Prompts

Now, based on the App State and the Condition that was set, the Assistant will speak out an appropriate message to the user.

To customize the prompts being spoken by the Assistant, refer to the Customizing the Assistant section

That's it. These are the basic steps required to add Slang's In-App Voice Assistant to your app.

There is a lot more power and flexibility available when consuming these Assistants. Refer to the Advanced Coding section for more details if you need it.

Let's get into the details of each of the journeys.