By now you must have configured and published your Assistant via the Slang Console. Congratulations! :) If you have not already done that, you can do so by following the instructions here.
While the overall idea is similar across platforms, the specific steps involved vary slightly based on the platform on which your app is built. Supported platforms are:
Android Native
React Native for Android
Web (JS)
Let's start coding!
For testing, we recommend using a physical Android device instead of an emulator because most emulators don't work well with microphones.
The first step is to update the app's build system to include Slang's Retail Assistant SDK.
Add the path to the Slang maven repository to your top-level gradle file
// Add this to your top-level gradle file
allprojects {
    repositories {
        …
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}
Add the Slang Retail Assistant dependency to your app's gradle file
// Add this to your app's gradle file
dependencies {
    …
    implementation 'in.slanglabs.assistants:slang-retail-assistant:4.0.1'
}
The next step is to install the required packages inside your code repo
If you use yarn to install packages, run the below command
$ yarn add @slanglabs/slang-conva-react-native-retail-assistant
If you use npm for managing your packages, run the below command
$ npm install @slanglabs/slang-conva-react-native-retail-assistant --save
Because Slang uses native libraries, you need to link the package to your codebase by running the automatic linking step
$ react-native link @slanglabs/slang-conva-react-native-retail-assistant
Finally, add the path to the Slang maven repository (to download native library dependencies) to your top-level gradle file
// Add this to your top-level gradle file
allprojects {
    repositories {
        …
        maven { url "http://maven.slanglabs.in:8080/artifactory/gradle-release" }
    }
}
The next step is to install the required packages inside your code repo
If you use yarn to install packages, run the below command
$ yarn add @slanglabs/slang-retail-assistant
If you use npm for managing your packages, run the below command
$ npm install @slanglabs/slang-retail-assistant --save
The next step is to initialize the SDK with the keys you obtained after creating the Assistant in the Slang console.
The recommendation is to perform the initialization in the onCreate method of the Application class. If the app does not use an Application class, the next best place would be the onCreate method of the primary Activity class.
// Your application class
protected void onCreate(Bundle savedInstance) {
    ...
    AssistantConfiguration configuration = new AssistantConfiguration.Builder()
        .setAPIKey(<API Key>)
        .setAssistantId(<AssistantId>)
        .setEnvironment(STAGING) // Change this to PRODUCTION once you've published the Assistant to production environment
        .build();
    SlangRetailAssistant.initialize(this, configuration);
}
This should ideally be done in the componentDidMount of your main app component
import SlangRetailAssistant from '@slanglabs/slang-conva-react-native-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantId: '<assistant id>',        // The Assistant ID from the console
    apiKey: '<API Key>',                  // The API key from the console
});
import SlangRetailAssistant from '@slanglabs/slang-retail-assistant';

SlangRetailAssistant.initialize({
    requestedLocales: ['en-IN', 'hi-IN'], // The languages to enable
    assistantID: '<assistant id>',        // The Assistant ID from the console
    apiKey: '<API Key>',                  // The API key from the console
});
Once the Assistant is initialized, the next step is to show the microphone UI element (what we call the Trigger) that the app's users can click on to invoke the Assistant and speak to it.
Add the below line to the onResume method of the Activities where you want the Assistant to be enabled.
protected void onResume() {
    ...
    SlangRetailAssistant.getUI().showTrigger(this); // There is a corresponding hideTrigger too if needed
}
You can call the showTrigger and hideTrigger methods as required to control the visibility of the Assistant
SlangRetailAssistant.ui.showTrigger(); // There is a corresponding hideTrigger too if needed
You can call the show and hide methods as required to control the visibility of the Assistant
SlangRetailAssistant.ui.show(); // There is a corresponding hide too if needed
The trigger is sticky, which means that it will show up on all Activities after it is made visible. To prevent the trigger from showing up on specific activities, you will need to call: SlangRetailAssistant.getUI().hideTrigger(this)
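For example, a minimal sketch of suppressing the trigger on a particular screen might look like the following. The CheckoutActivity name here is hypothetical; only the hideTrigger call is from the SDK as shown above.

// Hypothetical Activity on which the Assistant trigger should stay hidden
public class CheckoutActivity extends AppCompatActivity {
    @Override
    protected void onResume() {
        super.onResume();
        // Hide the sticky trigger for this screen only
        SlangRetailAssistant.getUI().hideTrigger(this);
    }
}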
Refresher: A UserJourney represents a path that a user may take to reach their goal when using a web or mobile app. See Voice Assistant Concepts for details.
Last but not least, the app needs to implement the Actions associated with the various User Journeys supported by the Assistant. This can be done as shown below
SlangRetailAssistant.setAction(new SlangRetailAssistant.Action() {
    @Override
    public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
        // Handle search requests
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    }

    @Override
    public NavigationUserJourney.AppState onNavigation(NavigationInfo navInfo, NavigationUserJourney navJourney) {
        // Handle navigation requests
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    }

    @Override
    public OrderManagementUserJourney.AppState onOrderManagement(OrderInfo orderInfo, OrderManagementUserJourney orderJourney) {
        // Handle order management requests
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    }

    @Override
    public void onAssistantError(final AssistantError error) {
        // Handle errors that might have occurred during the Assistant lifecycle
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    }
});
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        return SearchUserJourney.AppState.SEARCH_RESULTS;
    },
    onOrderManagement: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        return OrderManagementUserJourney.AppState.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppState.NAVIGATION;
    },
    onAssistantError: errorCode => {
        // Handle errors that might have happened during the processing of the Assistant
        // Error codes available:
        // FATAL_ERROR, SYSTEM_ERROR, ASSISTANT_DISABLED, INVALID_CREDENTIALS
    },
};

SlangRetailAssistant.setAction(actionHandler);
const actionHandler = {
    onSearch: (searchInfo, searchUserJourney) => {
        // Handle the search request
        // ...
        return SearchUserJourney.AppStates.SEARCH_RESULTS;
    },
    onManageOrder: (orderInfo, orderManagementUserJourney) => {
        // Handle the order request
        // ...
        return OrderManagementUserJourney.AppStates.VIEW_ORDER;
    },
    onNavigation: (navigationInfo, navigationUserJourney) => {
        // Handle the navigation request
        // ...
        return NavigationUserJourney.AppStates.NAVIGATION;
    },
};

SlangRetailAssistant.setAction(actionHandler);
The following user journeys are currently supported by the Slang Retail Assistant:
Voice Search
Voice Navigation
Voice Order Management
The Action Handler interface has an explicit callback for each of the supported user journeys. Whenever the Assistant detects the user journey the user is interested in (based on what they spoke), it invokes the callback associated with that user journey.
When these callbacks are invoked, the Assistant also passes the parametric data corresponding to the user journey that it was able to gather. The app is then expected to do the following (a sketch follows this list):
Consume the parametric data as needed
Optionally launch appropriate UI actions
Set appropriate conditions in the Assistant based on the app's internal state
Return the AppState that the app transitioned to
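As an illustration, here is a minimal Android sketch of these steps inside the onSearch callback. The setSearchSuccess and AppState calls are the ones shown in the examples below; the data handling and the showSearchResults method are hypothetical placeholders for the app's own logic.

@Override
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // 1. Consume the parametric data (accessors on SearchInfo are SDK-specific, shown here only as a comment)

    // 2. Optionally launch the appropriate UI action, e.g. an app-specific showSearchResults(searchInfo)

    // 3. Set the appropriate condition based on the app's internal state
    searchJourney.setSearchSuccess();

    // 4. Return the AppState that the app transitioned to
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}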
Refresher: An AppState typically corresponds to a screen that the app transitioned to based on user input. See Voice Assistant Concepts for details.
An AppState indicates which state the app transitioned to, based on the user journey and parametric data that was passed to the app. The list of supported AppStates depends on the user journey.
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the Search requests
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    return SearchUserJourney.AppStates.SEARCH_RESULTS;
}
AppState conditions represent more detailed states of the app within a particular AppState. For example, when performing a search, the search might have failed or the items might be out of stock. The app can use AppState conditions to indicate the correct condition to the Assistant. The condition controls the message that the Assistant speaks after the callback returns.
public SearchUserJourney.AppState onSearch(SearchInfo searchInfo, SearchUserJourney searchJourney) {
    // Handle the Search requests
    // ...
    searchJourney.setSearchSuccess(); // Set the condition
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    searchUserJourney.setSearchSuccess();
    return SearchUserJourney.AppState.SEARCH_RESULTS;
}
onSearch: async (searchInfo, searchUserJourney) => {
    // Handle the search request
    // ...
    searchUserJourney.setSuccess();
    return SearchUserJourney.AppStates.SEARCH_RESULTS;
}
Based on the AppState returned and the conditions that were set, the Assistant will speak out an appropriate message to the user.
The prompts spoken by the Assistant are customizable. Refer to the Customizing the Assistant section if you're interested in customizing them.
That's it! This is the basic set of steps required to add Slang's In-App Voice Assistant to your app.
Beyond this basic integration, Slang Voice Assistants provide a lot more power and flexibility to cater to the advanced needs of apps. Refer to the Advanced Concepts section for more details if you're looking for advanced features.